BugRep Software Test Plan - Second Iteration
Academic College of Tel-Aviv Yaffo, Software Engineering Course, Year 2006-2007

Project leads:      Shmuel Tyszberowicz, Yair Aviv
Project name:       BugRep
Document:           Software Test Plan
Document version:   0.1
Last update date:   12/06/2007
First version date: 12/06/2007
Team Members:

Name             ID#        E-mail                     Duty
Irit Lovenberg   033145558  [email protected]        Team Leader
Yael Shemla      034279125  [email protected]           Software configuration manager, SQA manager
Sagy Rozman      038356036  [email protected]    Implementation manager, Design manager
Chen Levkovich   033026055  [email protected]   Tests manager
Galit Daniel     034188334  [email protected]      Requirements manager
Revision Sheet

Revision Number  Date        Brief Summary of Changes
0.1              12/06/2007  Baseline draft document
Confirmation Sheet

Role                            Name            Signature       Date
Author                          Galit Daniel    Galit Daniel    12/06/2007
Software Configuration Manager  Yael Shemla     Yael Shemla     12/06/2007
SQA Manager                     Chen Levkovich  Chen Levkovich  12/06/2007
Team Leader                     Irit Lovenberg  Irit Lovenberg  12/06/2007
Contents

Revision Sheet
Confirmation Sheet
Contents
1. Introduction
2. Scope
   2.1 Identification
   2.2 Document Overview
   2.3 Acronyms and Definitions
       2.3.1 Acronyms
       2.3.2 Definitions
3. Referenced Documents
4. Software Test Environment
   4.1 Development Test and Evaluation
       4.1.1 Software Items
       4.1.2 Hardware and Firmware Items
       4.1.3 Other Materials
       4.1.4 Proprietary nature, acquirer's rights, and licensing
       4.1.5 Installation, testing, and control
   4.2 Test Site(s)
5. Test Identification
   5.1 General Information
       5.1.1 Test Levels
       5.1.2 Test Classes
       5.1.3 Test Progression
   5.2 Planned Testing
       5.2.1 Qualification Test
       5.2.2 Module Test
       5.2.3 Integration Test
       5.2.4 Installation Beta Test
6. Test Schedules
7. Risk Management
8. Requirements Traceability
9. Notes
Appendix A // BugRep Software Test Requirements Matrix
1. Introduction

This Software Test Plan describes the test planning for the second iteration of the BugRep project. Its purpose is to evaluate the quality of the iteration's products and to improve them by identifying defects and problems.
2. Scope

2.1 Identification

This software test plan details the testing planned for the BugRep project, Version 02, 12/06/2007. The goal of BugRep development is to manage system defects: the system allows customers to report issues and bugs in company products. Specifics regarding the implementation of these modules are identified in the BugRep Software Requirements Specification (SRS), with line-item descriptions in the accompanying Requirements Traceability Matrix (RTM).
2.2 Document Overview

This document describes the Software Test Plan (STP) for the BugRep software. BugRep documentation includes this STP, the Software Requirements Specification (SRS), the Requirements Traceability Matrix (RTM), the Software Project Plan (SPP), the Software Design Document (SDD), and the BugRep User's Manual. This STP describes the process to be used and the deliverables to be generated in testing BugRep. It is based on the BugRep Software Test Plan template, tailored to the processes associated with the creation of BugRep. The information contained in this STP is to be considered "For Official Use Only".
2.3 Acronyms and Definitions

2.3.1 Acronyms

RTM   Requirements Traceability Matrix
SCM   Software Configuration Management
SDD   Software Design Document
SDR   Software Design Review
SEPG  Software Engineering Process Group
SMM   Software Measurement and Metrics
SOW   Statement of Work
SPP   Software Project Planning
SQA   Software Quality Assurance
SRMP  Software Requirements Management Plan
SRS   Software Requirements Specification
STP   Software Test Plan
STR   Software Test Report
STD   Software Test Description
CER   Change Enhancement Request
TPR   Test Problem Report
ITR   Internal Test Report
WIMS  Work Information Management System
2.3.2 Definitions

alpha test
--Tests conducted at unit, integration, and qualification levels to remove defects and prove out reliability and satisfaction of requirements.
activity
“A major unit of work to be completed in achieving the objective of a software project. An activity has precise starting and ending dates, incorporates a set of tasks to be completed, consumes resources, and results in work products. An activity may contain other activities in a hierarchical manner.” [IEEE87]
baseline
“(1) A specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through formal change control procedures. (2) A document or a set of such documents formally designated and fixed at a specific time during the life cycle of a configuration item. Note: Baselines, plus approved changes from those baselines, constitute the current configuration identification. (3) Any agreement or result designated and fixed at a given time, from which changes require justification and approval.”[IEE91]
beta test
--Tests conducted to determine the actual usability of the product by having actual users at operational sites use the product.
configuration management
“A discipline applying technical and administrative direction and surveillance to: identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing implementation status, and verify compliance with specified requirements.” [IEEE90]
customer
“The individual or organization that specifies and accepts the project deliverables. The customer may be internal or external to the parent organization of the project, and may or may not be the end user of the software product. A financial transaction between the customer and developer is not necessarily implied.” [IEEE87]
developmental test
“Formal or informal testing conducted during the development of a system or component, usually in the development environment by the developer”[IEEE91]
function
“An activity that spans the entire duration of a software project. Examples of project functions include project management, configuration management, quality assurance, and verification and validation.” [IEEE87]
function points
—Metric used to determine size of effort for a project based on the “functionality” or “utility” of the project.
Function Point Method
—Method proposed by [Albrecht79] to determine the size of effort for completing a project based on an empirical relationship between the software’s information domain and assessments of the software’s complexity.
implementation
“(1) The process of translating a design into hardware components, software components, or both. (2) The result of the process in (1).”[IEEE91]
installation phase
“The period of time in the software life cycle during which a software product is integrated into its operational environment and tested in this environment to ensure that it performs as required.”[IEEE91]
integration test
“Testing in which software components, hardware components, or both are combined to evaluate the interaction between them.”[IEEE91]
module test
--Testing of an individual software module, or a group of related modules, in relative isolation from the rest of the system. A module test is specified by a set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement.
plan
“A detailed scheme, program, or method worked out beforehand for the accomplishment of an objective.” [Heritage85] —Defined set of procedures and the required resources to implement a policy.
policy
“A course of action, guiding principle, or procedure considered to be expedient.” [Heritage85] —Corporate strategy, defines high-level goals.
process
“A sequence of steps performed for a given purpose.” [IEEE90] —Activities and interfaces used to implement plan.
project
—unit of work to meet a specific customer requirement. Includes all tasks, activities, and functions necessary to meet the requirements.
project agreement
“A document or set of documents agreed to by the designated authority for the project and the customer. Documents in a project agreement may include some or all of the following: a contract, a statement of work, system engineering specifications, user requirements specification, functional specifications, the software project management plan, a business plan, or a project charter.” [IEEE87]
project deliverables
“The work product(s) to be delivered to the customer. The quantities, delivery dates, and delivery locations are specified in the project agreement.” [IEEE87]
program
—organization of personnel.
qualification test
“Testing conducted to determine whether a system or component is suitable for operational use.”[IEEE91]
quality assurance
“(1) A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.” [IEEE90] “(2) A set of activities designed to evaluate the process by which products are developed or manufactured. ” [IEEE90]
review
—A process or meeting during which a work product, or set of work products, is presented to program personnel, managers, users, customers, or other interested parties for comment or approval. Types include code review, design review, formal qualification review, requirements review, test readiness review.
software
“Computer programs, procedures, and associated documentation and data pertaining to the operation of a computer system.” [IEEE90]
software engineering
“The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; that is, the application of engineering to software.” [IEEE90]
software project management
“The process of planning, organizing, staffing, monitoring, controlling, and leading a software project.” [IEEE87]
software quality assurance
—See quality assurance.
Software Test Plan
-- A document that specifies the test inputs, execution conditions, and predicted results for an item to be tested.
specification
“A document that specifies, in a complete, precise, verifiable manner, the requirements, design, behavior, or other characteristics of a system or component, and, often, the procedures for determining whether these provisions have been satisfied.” [IEEE90]
statement of work
—Description of all the work required to complete a project, which is provided by the customer.
system requirements
—A condition or capability that must be met or possessed by a system or subsystem component to satisfy a condition or capability needed by a user to solve a problem.
task
“The smallest unit of work subject to management accountability. A task is a well-defined work assignment for one or more project members. The specification of work to be accomplished in completing a task is documented in a work package. Related tasks are usually grouped to form activities.” [IEEE87]
techniques
“Technical and managerial procedures that aid in the evaluation and improvement of the software development process.” [IEEE90]
work package
“A specification for the work to be accomplished in completing an activity or task. A work package defines the work product(s), the staffing requirements, the expected duration, the resources to be used, the acceptance criteria for the work products, the name of the responsible individual, and any special considerations for the work.” [IEEE87]
work product
—Any tangible item that results from a project function, activity, or task. Examples of work products include customer requirements, project plan, design documents, source and object code, user’s manuals.
3. Referenced Documents

BugRep-SRS    BugRep_SRS_ver_00_05.doc, Software Requirements Specification of the BugRep project, 18 January 2007.
BugRep-SDD    BugRep_SDD_ver_00_02.doc, Software Design Document of the BugRep project, 12 May 2007.
SWEBOK        Guide to the Software Engineering Body of Knowledge, Chapter 5, 2004 version.
IEEE87        IEEE Std 1058.1-1987, Standard for Software Project Management Plans.
IEEE90        IEEE Std 610.12-1990, Standard Glossary of Software Engineering Terminology (ANSI).
IEEE98        IEEE Std 829-1998, Standard for Software Test Documentation.
IEEE91        IEEE Std 610-1991, Computer Dictionary, Compilation of IEEE Standard Computer Glossaries.
Heritage85    The American Heritage Dictionary, Houghton Mifflin Publishers, 1985.
Paulk93       SEI Capability Maturity Model, Version 1.1, CMU/SEI-93-TR-24.
MIL-STD-498   Military Standard, Software Development and Documentation, 5 Dec 1994.
SOW           Statement of Work, 9 Sep 1994, Task Order 068-4-38, Contract No. F0863593-C-0068.
SOW-Amend1    Statement of Work, 8 Nov 1994, Amendment #1 to Task Order 068-4-38 dated 9 Sep 1994.
SOW-Amend2    Statement of Work, 30 Mar 1995, Amendment #2 to Task Order 068-4-38 dated 9 Sep 1994.
SOW-Amend3    Statement of Work, 14 Apr 1995, Amendment #3 to Task Order 068-4-38 dated 9 Sep 1994.
SOW-Amend4    Statement of Work, 8 Nov 1995, Amendment #4 to Task Order 068-4-38 dated 9 Sep 1994.
4. Software Test Environment
4.1 Development Test and Evaluation

Qualification, integration, and module level tests are to be performed at the home of each project team member.

4.1.1 Software Items

The software was developed using the NetBeans 5.5 IDE and requires the following software:
- JDK 1.5, including support for Java EE 5 (Enterprise Edition version 5)
- An application server that supports JDK 1.5 and EJB 3.0, for example the Sun Java System Application Server
- A database connection, for example the Apache Derby database bundled with NetBeans 5.5
For details of client and server software specifications, see the BugRep Software Requirements Specification (SRS).

4.1.2 Hardware and Firmware Items

No special hardware is required.

4.1.3 Other Materials

None.

4.1.4 Proprietary nature, acquirer's rights, and licensing

The software used in this project is Open Source; its usage is bound by each vendor's license agreement. For example, Sun's JDK license can be found at http://java.com/en/download/license.jsp.

4.1.5 Installation, testing, and control

The NetBeans IDE must be installed on the machine in order to use BugRep. For download and installation instructions, please visit http://www.netbeans.org/
Installation Instructions

1. Download the installation package from
2. Unzip the file to a desired location (e.g. c:\bugrep)
3. Select File --> Open and open the BugRepApp-war project
4. Right-click the project and select Resolve the association to the EJB project
5. Right-click the War project and select 'Run Project'
6. A web browser automatically opens at the relevant URL: http://localhost:8080/BugRepApp-war/faces/login.jsp
7. Enter a username and password (e.g. user 'iritl' and password '123456')
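After step 6, a quick smoke check can confirm that the deployed login page actually responds. The sketch below is illustrative and not part of the BugRep deliverables: the class name is hypothetical, and the host and port are simply the NetBeans defaults from the instructions above.

```java
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical post-installation smoke check: after deployment, the
// login page from step 6 should answer with an HTTP status code.
public class InstallSmokeCheck {

    /** Builds the BugRep login URL for a given host and port. */
    public static String loginUrl(String host, int port) {
        return "http://" + host + ":" + port + "/BugRepApp-war/faces/login.jsp";
    }

    public static void main(String[] args) {
        String url = loginUrl("localhost", 8080); // NetBeans defaults
        try {
            HttpURLConnection con =
                    (HttpURLConnection) new URL(url).openConnection();
            con.setRequestMethod("GET");
            // Any response (even an error code) proves the server is up.
            System.out.println(url + " -> HTTP " + con.getResponseCode());
        } catch (Exception e) {
            System.out.println("server not reachable: " + e.getMessage());
        }
    }
}
```

If the application server is not running, the check reports "server not reachable" instead of a status code, which points the tester at step 5.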
4.2 Test Site(s)

The software items required for the test site are similar to those listed for the development environment (section 4.1).
5. Test Identification

5.1 General Information

In the second iteration, the Project Management and User Management modules will be tested.

5.1.1 Test Levels

Tests are to be performed at the module, installation, and qualification levels prior to release for beta testing.

5.1.2 Test Classes

5.1.2.1 Check for correct handling of erroneous inputs

Test Objective - Check for proper handling of erroneous inputs for the User Management and Project Management modules: characters that are not valid for the field, too many characters, not enough characters, value too large, value too small, all selections for a selection list, no selections, all mouse buttons clicked or double-clicked all over the client area of the item with focus.
Validation Methods Used - Test
Recorded Data - User action or data entered, screen/view/dialog/control with focus, resulting action.
Data Analysis - Was the resulting action within the general fault handling capabilities defined in the BugRep SRS and the design in the BugRep SDD?
Assumptions and Constraints - None

5.1.2.2 Check for maximum capacity
Test Objective - Check software and database maximum capacities for data: enter data until the maximum number of records specified in the design is reached for each table, then operate the program and add one more record.
Validation Methods Used - Test
Recorded Data - Number of records in each table, resulting actions.
Data Analysis - Was the resulting action at maximum-plus-one normal?
Assumptions and Constraints - This test requires populating a database with several times more records than exist in the sample data set, which takes a good deal of time. Integration and qualification test only.

5.1.2.3 User interaction behavior consistency

Test Objective - Is the interaction behavior of the user interface consistent across the application or module under test: tab through controls; use mouse click and double-click on all controls and in the null area; maximize, minimize, and normalize; switch focus to another application and then back; update data on one view that is included on another and check whether the other view is updated when data is saved on the first view; use function keys, movement keys, and other standard key combinations (clipboard combos, control-key windows and program-defined sets, Alt-key defined sets); enter invalid Control and Alt key sets to check for proper handling.
Validation Methods Used - Test, Inspection
Recorded Data - Any anomalies where the action resulting from a user action does not conform to the behavioral standards for Windows programs.
Data Analysis - Was the resulting action within the behavioral standards of Windows programs as defined in the BugRep SRS and the design in the BugRep SDD? Was behavior consistent across the application or module as defined in the BugRep SRS and BugRep SDD?
Assumptions and Constraints - If testing at module level, the multiple-view portion of the test may not apply, since only a single view is available.
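To make the erroneous-input test class concrete, a module-level check might look like the following sketch. The validator class and its field limits are hypothetical (the real field rules live in the BugRep SRS); the sketch only illustrates the pattern of feeding boundary and invalid values and expecting a controlled rejection rather than a fault.

```java
// Hypothetical module-level validator exercising the "erroneous inputs"
// test class: each rule mirrors a field constraint of the kind the SRS
// would define for a user-name field.
public class FieldValidator {
    // Assumed limits for a user-name field (illustrative only).
    static final int MIN_LEN = 3;
    static final int MAX_LEN = 20;

    /** Accepts only names of legal length made of letters and digits. */
    public static boolean isValidUserName(String name) {
        if (name == null) return false;             // missing input
        if (name.length() < MIN_LEN) return false;  // not enough characters
        if (name.length() > MAX_LEN) return false;  // too many characters
        for (int i = 0; i < name.length(); i++) {
            // reject characters that are not valid for this field
            if (!Character.isLetterOrDigit(name.charAt(i))) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // A module test feeds boundary and erroneous values and records
        // whether each is rejected gracefully rather than crashing.
        String[] inputs = { null, "", "ab", "iritl", "user!" };
        for (int i = 0; i < inputs.length; i++) {
            System.out.println(inputs[i] + " -> " + isValidUserName(inputs[i]));
        }
    }
}
```

The recorded data for each run is the input and the resulting action, which is then compared against the fault-handling behavior specified in the SRS and SDD.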
5.1.2.4 Retrieving data

Test Objective - Is the data retrieved correct: for each dialog, list box, combo box, and other control that shows lists, check the displayed data for correctness.
Validation Methods Used - Test, Inspection
Recorded Data - Data displayed and its data sources (records from tables, resource strings, code sections).
Data Analysis - Was the data displayed correctly? Compare the data displayed with its sources.
Assumptions and Constraints - Requires alternate commercial database software to get records from the database.

5.1.2.5 Saving data

Test Objective - Is the data entered saved to the database correctly: for each dialog, list box, combo box, and other control that shows lists, check that the data entered is saved correctly in the database.
Validation Methods Used - Test, Inspection
Recorded Data - Data entered and its destinations (records in tables).
Data Analysis - Was the data saved correctly? Compare the data entered with its destination.
Assumptions and Constraints - Requires alternate commercial database software to get records from the database.

5.1.2.6 Display screen and printing format consistency

Test Objective - Are user interface screens organized and labeled consistently, and are printouts formatted as specified: enter data to the maximum length of a field in a printout and then print; show all screens (views, dialogs, print previews, OLE devices) and dump their images to paper.
Validation Methods Used - Inspection
Recorded Data - Screen dumps and printouts.
Data Analysis - Was the printout format correct? Were fields with maximum-length data not clipped? Were the labels and organization of screens consistent across the application or module as defined in the BugRep SDD?
Assumptions and Constraints - The module that performs forms printing is required alongside all other modules during their testing.

5.1.2.7 Check interactions between modules

Test Objective - Check the interactions between modules: enter data and save it in one module, then switch to another module that uses that data and check for the latest data entered; switch back and forth between all of the modules while manipulating data and check for adverse results or program faults.
Validation Methods Used - Demonstration
Recorded Data - Screen dumps.
Data Analysis - Were the resulting actions within the specifications defined in the BugRep SRS and the design in the BugRep SDD?
Assumptions and Constraints - Requires customer participation. Requires all modules and supporting software.

5.1.2.8 Measure time of reaction to user input

Test Objective - Check the average response time to user input actions: clock the time of saves, retrieves, dialog opens and closes, and view opens and closes; clock any response to a user action that takes longer than 2 seconds.
Validation Methods Used - Test, Analysis
Recorded Data - Each action and its response clock time.
Data Analysis - Organize the measurements into categories and average their values. Are all average values below the response-time limit specified in the BugRep SRS and the design in the BugRep SDD?
Assumptions and Constraints - None

5.1.2.9 Functional Flow

Test Objective - Exercise all menus, buttons, hotspots, etc. that cause a new display (view, dialog, OLE link) to occur.
Validation Methods Used - Demonstration
Recorded Data - Screen dumps.
Data Analysis - Were the resulting actions within the specifications defined in the BugRep SRS and the design in the BugRep SDD?
Assumptions and Constraints - Requires customer participation. Requires all modules and supporting software.

5.1.3 Test Progression

This section describes the progression of testing. A diagram is often helpful when attempting to describe the progression of testing.
The Qualification Testing is a qualification-level test verifying that all requirements have been met. The module and integration tests are performed as part of the Implementation phase as elements and modules are completed. All module and integration tests must be passed before performing Qualification Testing, and all module tests must be passed before performing the associated integration-level tests. The CER tracking system is used to determine eligibility for testing at a level. The testing for the second iteration is at the integration level.

[Figure: Software Test Progression - the Software Requirements Specifications yield module and integration requirements; Implementation (code) of modules feeds Module Tests (generating TIRs), then Integration Tests (generating TIRs), and finally Qualification Testing, which produces the Test Summary Report (TSR); CERs cycle through Working, Testing, and Fixed states between the stages.]
5.2 Planned Testing

This section provides a detailed description of the types of testing to be employed.

5.2.1 Qualification Test

All of the requirement test items (refer to the Software Requirements Traceability Matrix) are to be tested as qualification-level tests. The resulting output of the qualification test is the BugRep Software Test Report (STR). If the qualification test is passed and its results are accepted by the project managers, the BugRep software will be ready for beta release.

For the qualification-level tests the following classes of tests will be used:
- Check for correct handling of erroneous inputs
- Check for maximum capacity
- User interaction behavior consistency
- Retrieving data
- Saving data
- Check interactions between modules
- Measure time of reaction to user input
- Functional Flow
5.2.2 Module Test

All of the modules to be tested (refer to the Software Requirements Traceability Matrix) are to be tested using the defined module-level test methodology. When all of the module tests for a module are passed, the module is ready for integration-level testing. We are going to test 3 modules: defect, user, and project.

For the module-level tests the following classes of tests will be used:
- Check for correct handling of erroneous inputs
- Check for maximum capacity
- User interaction behavior consistency
- Retrieving data
- Saving data
- Measure time of reaction to user input

5.2.3 Integration Test

All of the modules to be integration tested (refer to the Software Requirements Traceability Matrix) will be tested using the integration-level test methodology. We are going to test the integration between 3 modules: defect, user, and project.

For the integration-level tests the following classes of tests will be used:
- User interaction behavior consistency
- Check interactions between modules
- Measure time of reaction to user input
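The "measure time of reaction to user input" class used at both module and integration level can be scripted rather than hand-clocked. The sketch below is an illustrative harness, not part of BugRep: the timed action is a stand-in for a real save or retrieve, and the 2-second threshold comes from test class 5.1.2.8.

```java
// Illustrative timing harness for the "measure time of reaction" test
// class: runs an action several times, averages the elapsed time, and
// compares it against the 2-second limit from section 5.1.2.8.
public class ResponseTimer {
    static final long LIMIT_MILLIS = 2000; // threshold from the test class

    /** Times one run of the given action, in milliseconds. */
    public static long timeOnce(Runnable action) {
        long start = System.nanoTime();
        action.run();
        return (System.nanoTime() - start) / 1000000;
    }

    /** Average elapsed time over several runs of the action. */
    public static long averageMillis(Runnable action, int runs) {
        long total = 0;
        for (int i = 0; i < runs; i++) {
            total += timeOnce(action);
        }
        return total / runs;
    }

    public static void main(String[] args) {
        // Stand-in for a real user action such as saving a defect record.
        Runnable fakeSave = new Runnable() {
            public void run() {
                try { Thread.sleep(10); } catch (InterruptedException e) { }
            }
        };
        long avg = averageMillis(fakeSave, 5);
        System.out.println("average = " + avg + " ms, within limit = "
                + (avg <= LIMIT_MILLIS));
    }
}
```

In an actual test run the recorded data would be each action category and its averaged clock time, matching the Recorded Data and Data Analysis items of 5.1.2.8.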
5.2.4 Installation Beta Test

Following qualification testing, the customer will review the results. A signature of acceptance initiates product delivery and the start of the installation beta test. The identified tests (refer to the Software Requirements Traceability Matrix) will be tested using the defined installation-level beta test procedures. Following installation testing, the customer will review the results. A signature of acceptance completes the BugRep Version 2.0 project.

For the installation-level tests the following classes of tests will be used:
- User interaction behavior consistency
- Retrieving data
- Saving data
- Check interactions between modules
- Measure time of reaction to user input
- Functional Flow
6. Test Schedules

Testing is scheduled to start on 12/06/2007 and to finish by 16/06/2007.
7. Risk Management

o Development may not finish on time, delaying the test schedule.
o Installation of the system may fail at some group members' sites.
8. Requirements Traceability

Refer to the BugRep Software Requirements Specification, Appendix A, for information regarding requirements traceability.

9. Notes

None.
Appendix A // BugRep Software Test Requirements Matrix

Requirement             SRS ref  Priority  Risk  Tested  Findings
Create user             2.2.1    High      Low   V       Works
Read user               2.2.1    High      Low   V       Works
Update user             2.2.1    High      Low   V       Works
Delete user             2.2.1    High      Low   N/A
Create project          2.2.1    High      Low   V       Works
Read project            2.2.1    High      Low   V       Works
Update project          2.2.1    High      Low   V       Works
Delete project          2.2.1    High      Low   V       Works
Assign project manager  2.2.3.1  High      Low   N/A
Link module to project  2.2.3.2  Low       High  N/A
Assign module owner     2.2.3.3  Low       High  N/A

V - Valid - Test was performed and passed
N/A - Not Available - Requirement is out of scope
N/A (yet) - Not Available yet - Requirement will be developed in a future release