Test Plan
Version 1.0, 2008.10.24 (created 2008.10.14)
Yahoo! Property View

Rob Shaw, Team Leader
Jacob McDorman, Project Leader
Robert Read, Technologist
Brad Van Dyk, Editor

Table of Contents

[1] Introduction
  [1.1] Document Scope
  [1.2] Intended Audience
  [1.3] Project Identification
[2] Software Quality Goals
  [2.1] Overview
  [2.2] Metrics
  [2.3] Tools
[3] Testing Processes
  [3.1] Overview
  [3.2] Version Control
  [3.3] Code Reviews
  [3.4] Other Test Processes
[4] Test Cases
  [4.1] Unit Test Cases
    [4.1.1] Unit Test Case 1: DatabaseComm
    [4.1.2] Unit Test Case 2: UserDataClass
    [4.1.3] Unit Test Case 3: TabDataManager
    [4.1.4] Unit Test Case 4: TabViewer
  [4.2] Integration Test Cases
    [4.2.1] Integration Test Case 1: Testing User Data
    [4.2.2] Integration Test Case 2: Testing Added/Removed Property
    [4.2.3] Integration Test Case 3: Stop Alerts until Specified Time
    [4.2.4] Integration Test Case 4: Testing Update of User Settings
    [4.2.5] Integration Test Case 5: Test Login and Connection with Read Capabilities
  [4.3] System and User Interface Test Cases
    [4.3.1] UI Test Case 1: Logging in
    [4.3.2] UI Test Case 2: Manual Refresh
    [4.3.3] UI Test Case 3: User edits data that is writable to the database
    [4.3.4] UI Test Case 4: Setting Alert Options
    [4.3.5] UI Test Case 5: Receiving Alerts
    [4.3.6] UI Test Case 6: Stop Alerts Until Specified Time
[5] Appendix

[1] Introduction

[1.1] Document Scope

This document describes the testing plan that will be put into place and followed by Team E while working on the Yahoo! Property View widget. It contains a detailed outline of the guidelines that will be followed, as well as the testing processes and test cases that will be used in conjunction with developing the widget.

Section two of this document pertains to Software Quality Goals. It describes the metrics and tools that the development team will use to assess the quality of the software that is produced. Section three describes the testing processes, including how the development team will handle version control and code reviews on the project as a whole. The final section of the document covers the test cases and how they will be used during development of our components.

[1.2] Intended Audience

This document is not intended for widget users and does not give any information on how to use the widget. It is intended for stakeholders (persons affected by the project) and the software development team (Purdue Team E).

[1.3] Project Identification

As stated in section 1.4.1 of the requirements document, this project will provide a solution to Yahoo!'s difficulty with accessing data collected from the various systems Yahoo! offers. The domain of this widget is statistical analysis; accordingly, the goal of this software is to aid the user with analysis by providing trending data, volume data, and alerting mechanisms.

[2] Software Quality Goals

[2.1] Overview

The Property View widget shall be developed for internal use by Yahoo!. Therefore, the code should be maintainable, well documented, and easy to read. Code inspections shall be performed to assess whether any vulnerabilities exist within the system that could harm the user or the Yahoo! network. The Property View widget shall be developed such that any SQL statements sent to the Yahoo! Oracle database are checked for malicious input such as SQL injection.
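To make the SQL-injection check concrete, the sketch below shows one way a query helper in the widget's JavaScript could refuse statements built from raw user input and require bind parameters instead. This is a minimal illustration only; the names (executeQuery, connection.execute) and the regular-expression checks are assumptions for this sketch, not interfaces from the design documents.

    // Hypothetical guard applied to every SQL statement sent to the Oracle database.
    // User-supplied values must arrive through bindParams, never inside sqlText itself.
    function executeQuery(connection, sqlText, bindParams) {
        // Reject statement separators and comment markers, which are common injection carriers.
        if (/;|--|\/\*/.test(sqlText)) {
            throw new Error("Rejected SQL: statement separator or comment found.");
        }
        // Reject quoted literals; values belong in bind parameters such as :propertyName.
        if (/'/.test(sqlText)) {
            throw new Error("Rejected SQL: string literals must be passed as bind parameters.");
        }
        // Assumed driver call: bind placeholders are resolved by the database,
        // so user input is never interpreted as SQL text.
        return connection.execute(sqlText, bindParams);
    }

    // Example usage (hypothetical): the property name is bound, not concatenated.
    // executeQuery(conn, "SELECT * FROM mmt_data WHERE property = :propertyName",
    //              { propertyName: userInput });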

[2.2] Metrics

Metrics shall be collected on the code at intervals spread throughout the remainder of the development cycle.

One metric that will be used in the analysis of the Property View widget's code is code coverage. Code coverage describes the degree to which the source code of a program (here, the Property View widget) has been exercised by tests. Because it inspects the code directly, it is a form of white-box testing. The coverage criteria that will be used to measure how thoroughly the program is exercised are listed below (a short code sketch in section 2.3 illustrates the difference between several of them):

- Function coverage: Has each function in the program been executed?
- Statement coverage: Has each line of the source code been executed?
- Decision coverage: Has each control structure (e.g., an if statement) evaluated to both true and false?
- Condition coverage: Has each Boolean sub-expression evaluated to both true and false?
- Path coverage: Has every possible route through a given part of the code been executed?
- Entry/exit coverage: Has every possible call and return of the function been executed?

Coverage criteria will be evaluated with the Fortify software described in the Tools section below, as well as with JSCoverage, a program developed to measure the coverage of JavaScript code. Other JavaScript coverage programs have also been researched and could potentially be used.

Another metric that will be used in the analysis of the Property View widget is source lines of code (SLOC). SLOC measures the size of the program by counting the number of lines in the source code. It will be used to give a rough estimate of the effort already spent developing the widget, as well as estimates of how much effort will be needed for the remaining components.

Cohesion and coupling are also metrics that will be used to help analyze the Property View widget. They will be examined to determine how much the components rely on each other and how strongly related the responsibilities of a given class are.

[2.3] Tools

The Fortify software package will be used as a tool to analyze code written during development of the Yahoo! Property View widget. The Fortify Source Code Analyzer (SCA), part of this package, is a tool to which the team has access and which it shall use while writing code. The Fortify SCA examines every line of code and every program path to identify hundreds of different types of potentially exploitable vulnerabilities.
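To make the coverage criteria from section 2.2 concrete, the following hypothetical JavaScript fragment (illustrative only, not code from the widget) shows how statement, decision, and condition coverage differ:

    // Hypothetical function used only to illustrate coverage criteria.
    function classifyAlert(volume, threshold, muted) {
        var status = "silent";
        if (volume > threshold && !muted) {   // one decision made of two conditions
            status = "alert";
        }
        return status;
    }

    // Statement coverage: classifyAlert(10, 5, false) alone executes every statement.
    // Decision coverage:  adding classifyAlert(1, 5, false) makes the if evaluate to
    //                     both true and false.
    // Condition coverage: adding classifyAlert(10, 5, true) as well makes each
    //                     sub-expression (volume > threshold, !muted) take both values.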

JSCoverage is a tool that shall be used to help analyze the code coverage of the Property View widget. It shall be run periodically to see how thoroughly the code is being exercised, and reports shall be created based on the results of each run.

[3] Testing Processes

[3.1] Overview

As discussed in the overview of the Software Quality Goals (section 2.1 of this document), the Property View widget shall be developed for internal use by Yahoo!. The development team shall adhere to the guidelines outlined in that section as well as follow the practices described below.

[3.2] Version Control

According to section 3 (Version Control) of the Implementation Plan, a CVS repository shall be set up by the development team and used throughout construction of the Property View widget. Each team member shall be given a separate branch in which to develop code, and all team members shall be able to view one another's code. Individual team members will be assigned a task (possibly shared with other team members, depending on its scope), and during ongoing development of that task the code shall be updated within the member's branch. Code will be committed to the head branch, and the version incremented, upon completion of a new code base that has passed code reviews, unit testing, and regression testing. For the software version to be incremented, the new code base must meet one or more of the criteria discussed in Section 3 of the Implementation Plan, which are:

1. Yields better test results.
2. Fixes known bug(s).
3. Provides an improved look and/or feel.
4. Adds new features (not previously implemented).

As stated in the Implementation Plan, Eclipse shall be used by each team member that has access to the team's CVS repository.

[3.3] Code Reviews

All code developed by team members must follow the coding standards outlined in the Implementation Plan (section 5). These guidelines cover the styling and commenting of code.

Code reviews shall be an important task in the development of the Property View widget. The Gantt chart provided in the Implementation Plan (section 2) allots time for testing features that were recently implemented; this time will also be used to conduct code reviews alongside the other types of testing involved in development of the widget.

As code changes are made by a team member, a review document must be filled out that highlights the changes, with references to line numbers and to how the changes relate to the requirements and design documents. The team member must do individual testing before submitting the code for a code review.

[3.4] Other Test Processes

Tests shall be automated wherever possible, with scripts written to test individual components. However, some manual testing of the GUI will be required for behavior that cannot be exercised by automated scripts.

[4] Test Cases

[4.1] Unit Test Cases

Some separate helper functions will be needed to support unit testing. One function will dump all the data pulled from a SQL statement to a file for quick and easy viewing (a sketch of this helper appears after Unit Test Case 3 below).

[4.1.1] Unit Test Case 1: DatabaseComm

Module or Class: Database Communicator

1. Testing: connect() method
2. Testing: query(string) method
3. Testing: insert() method

[4.1.2] Unit Test Case 2: UserDataClass

Module or Class: User Data

1. Testing: loadUserPreferences() method
2. Testing: saveUserPreferences() method

[4.1.3] Unit Test Case 3: TabDataManager

Module or Class: Tab Data Manager

1. Automatic Updates
   a. Dump all data from MMT to a text file using SQL statements pulled from the user table.
2. Testing: enableAlerts(boolean) method
3. Testing: stopAlertsUntil(time) method
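As a sketch of the dump helper described at the start of this subsection, the following JavaScript shows one possible shape for it. The runOracleCommand call is the name used in Integration Test Case 5; the writeTextFile helper and the row format are assumptions about the widget runtime, not decided interfaces.

    // Hypothetical helper: run a SQL statement and write every returned row to a
    // text file so the tester can inspect the data quickly.
    function dumpQueryToFile(sqlStm, fileName) {
        var rows = runOracleCommand(sqlStm);   // named in section 4.2.5; assumed here to return an array of row objects
        var lines = [];
        for (var i = 0; i < rows.length; i++) {
            var values = [];
            for (var column in rows[i]) {
                values.push(column + "=" + rows[i][column]);
            }
            lines.push(values.join(", "));
        }
        writeTextFile(fileName, lines.join("\n"));   // hypothetical file-writing helper in the widget runtime
    }

    // Example use for Unit Test Case 1: dump the first ten users for inspection.
    // dumpQueryToFile("SELECT * FROM users WHERE ROWNUM <= 10", "user_dump.txt");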

[4.1.4] Unit Test Case 4: TabViewer

Module or Class: Tab Viewer

1. Testing: refresh function.

[4.2] Integration Test Cases

Integration testing will be done using usage-model testing. This strategy relies heavily on the team members performing isolated unit testing on the individual components they work on. The goal of the strategy is to avoid redoing testing that has already been done and instead to flesh out problems caused by the interaction of the components in the environment.

[4.2.1] Integration Test Case 1: Testing User Data

Preconditions: User is valid in the database with some property data.

1. A call is made to connect to the database with the username.
   Result: Successful connection; no errors logging in.
2. A call is made to grab data from MMT relating to the current username.
   Result: User data is downloaded and stored in a temporary text file (to check whether the data was downloaded correctly).

[4.2.2] Integration Test Case 2: Testing Added/Removed Property

Preconditions: User is valid in the database with some property data. A property has been added to the user's table. New data is available to be added to test the new property. A connection has already been made with the current user.

1. A call is made to refresh user information from the user table.
   Result: Current data for the user as well as new SQL statements are downloaded.
2. A call is made to get information from MMT using the user's SQL statements, with data filled in pertaining to the properties to which the user belongs.
   Result: A dump of all data is created in a temporary text file to check the validity of the data. Information from the newly added property shall be downloaded.

3. The tester removes the user from a property in the users table.
   Result: The user belongs to one less property.
4. A call is made to refresh user information from the user table.
   Result: Current data for the user as well as new SQL statements are downloaded.
5. A call is made to get information from MMT using the user's SQL statements, with data filled in pertaining to the properties to which the user belongs.
   Result: A dump of all data is created in a temporary text file to check the validity of the data. Information from the newly added property shall not be downloaded.

[4.2.3] Integration Test Case 3: Stop Alerts until Specified Time

Preconditions: User is valid in the database with some property data. Data is added to test a new anomaly alert. A connection has already been made with the current user.

1. A date and time are specified in a variable (only a few minutes in the future, for easy testing). A call is made to change the current user's settings.
   Result: The current user's next specified date/time to receive alerts is updated.
2. New anomaly information is added to the database (it must be part of a property that the current user belongs to). Check the alerts log.
   Result: No new alerts should be displayed.
3. Wait until the date/time specified in the variable. Call checkForAlerts(). Check the alerts log.
   Result: A new alert should be created with information pertaining to the data that was added in step 2.
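Because section 3.4 calls for these tests to be scripted where possible, the following sketch shows how Integration Test Case 3 might be automated. It assumes the TabDataManager methods named in Unit Test Case 3 (stopAlertsUntil, checkForAlerts), plus a hypothetical alertsLog array, insertAnomaly test fixture, and waitUntil helper; the real interfaces may differ.

    // Minimal assertion helper for the test scripts.
    function assert(condition, message) {
        if (!condition) { throw new Error("TEST FAILED: " + message); }
    }

    // Sketch of an automated run of Integration Test Case 3.
    function testStopAlertsUntilSpecifiedTime(tabDataManager, testDb) {
        // Step 1: hold alerts until a couple of minutes from now.
        var resumeAt = new Date(new Date().getTime() + 2 * 60 * 1000);
        tabDataManager.stopAlertsUntil(resumeAt);

        // Step 2: add an anomaly for a property the user belongs to; no alert yet.
        testDb.insertAnomaly("property-the-user-belongs-to");
        tabDataManager.checkForAlerts();
        assert(tabDataManager.alertsLog.length === 0, "no alert before the specified time");

        // Step 3: once the specified time has passed, the held alert should appear.
        waitUntil(resumeAt);                    // hypothetical helper that waits until resumeAt
        tabDataManager.checkForAlerts();
        assert(tabDataManager.alertsLog.length === 1, "alert created after the specified time");
    }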

[4.2.4] Integration Test Case 4: Testing Update of User Settings

Preconditions: User is already logged in. User has settings already defined.

1. Set the receiveAlerts variable to true. Call updateUserSettings().
   Result: A message of successful change is returned.
2. Add new information to the anomaly database. Call checkForAlerts().
   Result: A new alert should be created with information pertaining to a property to which the user belongs.
3. Set the receiveAlerts variable to false. Call updateUserSettings().
   Result: A message of successful change is returned.
4. Add new information to the anomaly database. Call checkForAlerts().
   Result: No new alerts should have been created.

[4.2.5] Integration Test Case 5: Test Login and Connection with Read Capabilities

Preconditions: Database login is valid for connecting to the database.

1. Set oracleDbLoginName = . Set oracleDbPassword = . (These two values are used to make the initial connection to the Oracle database; they are different from usernames.) Call openOracleConnection().
   Result: A "Connection Open" success message is returned.
2. Create a SQL statement to dump the first 10 users in the database. Call runOracleCommand(sqlStm).
   Result: The first 10 users in the Oracle database user table are returned.

[4.3] System and User Interface Test Cases

[4.3.1] UI Test Case 1: Logging in

Preconditions: User has started the program.

1. User tries logging in with no username typed in the login display.
   Result: An error message appears telling the user to enter a username.
2. User tries logging in with a username not found in the contacts table of the database.
   Result: An error message appears saying that the username is not valid.
3. User logs in with a valid username.
   Result: The user's tabs, according to MMT, are created. Any options saved for that user from a previous login are loaded; if the user has not previously logged in, the default options are loaded.

[4.3.2] UI Test Case 2: Manual Refresh

Preconditions: User has logged in.

1. User clicks the refresh button for a specific tab.
   Result: The tab's data is synchronized with the database data it is displaying and graphing. Any alerts arising from the updated data are handled according to the user's alert specifications for the tab.
2. User clicks the refresh-all button.
   Result: The data for each tab is synchronized with the database. Any alerts arising from the updated data are handled according to the alert settings chosen by the user for the tab(s) receiving alerts.

[4.3.3] UI Test Case 3: User edits data that is writable to the database

Preconditions: User has logged in. User has a tab open where there is an editable field.

1. User inputs data of an invalid data type into the editable field.
   Result: An error message appears saying the input value is of an invalid data type.
2. User inputs valid data into the editable field.
   Result: The table entry in the database that corresponds to the field edited by the user is updated with the user's input.

[4.3.4] UI Test Case 4: Setting Alert Options

Preconditions: User has logged in. User has the settings tab open.

1. User inputs data of an invalid data type into an editable field.
   Result: An error message appears saying the input value is of an invalid data type.
2. User inputs valid data into the editable field.
   Result: The table entry in the database that corresponds to the field edited by the user is updated with the user's input.

[4.3.5] UI Test Case 5: Receiving Alerts

Preconditions: User has logged in. User has the receive-alerts option set to true.

1. Insert new information into the database (creating an anomaly) for a property to which the user does not belong.
   Result: No alert shall be displayed.
2. Insert new information into the database (creating an anomaly) for a property to which the user does belong.
   Result: An alert shall pop up with information pertaining to the anomaly.

[4.3.6] UI Test Case 6: Stop Alerts Until Specified Time

Preconditions: User has logged in. User belongs to a property.

1. User selects a time and date that is in the past (relative to the current date and time).
   Result: An error message shall be displayed saying the specified date is not in the future.
2. User selects a time and date that is in the future.
   Result: A pop-up informs the user that all alerts will be held until the specified date, and the label stating when the next alert shall be processed is updated.

[5] Appendix

Definitions:

Property View widget - The application being developed for Yahoo!.

External Links:

Tools Used by Development Team:
Fortify Software: http://www.fortify.com/
JSCoverage: http://siliconforks.com/jscoverage/