Test Plan Document
INTERNATIONAL-KIDS.COM DEVELOPMENT PROJECT
Prepared by: Netizen Team
Version: 1.0
Created on: 10-Oct-2007
Last Modified:
Document: Test Plan
Revision History

Version/Revision Number | Author | Description | Approver | Effective Date
1.0 | Netizen | Initial Test Plan Draft | |
Table of Contents

1 Introduction
  1.1 Purpose of this Document
  1.2 Overview
  1.3 Scope
    1.3.1 Testing Phases
    1.3.2 Testing Types
  1.4 Not in Scope
  1.5 Reference Documents
  1.6 Definitions and Acronyms
  1.7 Assumptions and Dependencies
2 Test Requirement
  2.1 Features to be Tested
  Milestones (Schedule)
3 Testing Environment
  3.1 Browsers
  3.2 Hardware and Software Requirements
    3.2.1 Offshore
      3.2.1.1 Development / Development Integration Environment
      3.2.1.2 QA
  3.3 Human Resources
4 Roles and Responsibilities
5 Test Strategy
  5.1 Test Process Workflow
  5.2 Test Organize/Review Project Documentation
  5.3 Develop System Test Plan
  5.4 Test Design/Development
  5.5 Unit Test Execution
  5.6 Integration/System Test Execution
    5.6.1 Integration Testing
    5.6.2 System Testing
    5.6.3 Testing Types
    5.6.4 Test Execution Workflow
  5.7 Defect Tracking and Management
  5.8 Update Documents and Results
  5.9 Test Reports
  5.10 UAT and Closure
6 Configuration Management
7 Deliverables
1 Introduction

1.1 Purpose of this Document

The purpose of this document is to outline the Test Strategy/Approach and the Quality Assurance process for International-kids.com. This document establishes the System Test Plan for the International-kids.com application. It will allow the development team, business analysts and project management to coordinate their efforts and efficiently manage the testing of the site. The QA process outlined in this System Test Plan will ensure that a quality International-kids.com application is deployed successfully and on schedule. The intended audience for this document is all stakeholders of the International-kids.com project.
1.2 Overview

The current International-kids.com is Windows XP based, compatible with Office 2002, and written in PHP, using a MySQL 5.0 Server database. International-kids.com's expectation for the new application is twofold:
1. Front Office functionalities
2. Back Office functionalities
The focus is primarily on successful migration and implementation of the application. The main objective of this Test Plan is to define the methodology to test the International-kids.com application to check and ensure that:
• The new system preserves all of its current business functionalities.
• The enhancements have been implemented in the new system.
• Newer enhancements do not adversely affect the current business functionalities.
• The system has the flexibility/capacity to deal with the complex International-kids.com structure and programs as it continues to change.
1.3 Scope

1.3.1 Testing Phases

The following table lists the various phases of International-kids.com application testing and the team responsible for each.

Phase | Team Responsible
Unit Testing | Development team
Integration Testing | Testing team
System Testing | Testing team
User Acceptance Testing | International-kids.com User Representatives

1.3.2 Testing Types
The International-kids.com application will undergo the following types of testing. All types of testing are explained in detail in the Test Strategy section.

Functionality Testing: Performed by the testing team during the Integration/System testing phase to meet the agreed-upon functional requirements of the International-kids.com application. The following functional areas will be put to test: (1) Application Submission, (2) Peer Review. All the features put under test are mentioned in brief under the "Features to be Tested" section of this plan and will be described in detail in Test Scenario documents and Test Case documents. On completion of every single functional area, the test scenario and test case documents will be delivered. Please refer to the Deliverables section mentioned below.

Database Testing: Performed by the testing team during the Integration/System testing phase to qualify the database, which houses the content that the International-kids.com application manages, runs queries against and uses to fulfill user requests for data storage. Database migration testing will be taken care of by the DBAs.

Security Testing: Performed by the testing team during the Integration/System testing phase to meet the agreed-upon security requirements of the International-kids.com application.

GUI and Usability Testing: Performed by the testing team during the Integration/System testing phase.

Performance and Load/Volume Testing: Performed by the testing team during the System Testing phase. Automation testing will be performed to carry out these types of testing; a tool will be used to perform these tests. The various reports that are part of the International-kids.com application will be one of the main areas covered while performing load/volume testing. The performance test methodology is described in the Test Strategy section.

Code Testing: Performed by the development team during the Unit Testing phase at every method level.

Smoke Testing: Performed by the development team during the Unit Testing phase to qualify the build for release to the testing team, and by the testing team during the Integration/System phase to qualify the build for further tests.

Regression Testing: Performed by the testing team during the Integration/System testing phase for re-testing an entire or partial system after a modification has been made, to ensure that no unwanted changes were introduced to the system.

Defect Fix Verification Testing (Defect Validation Testing): Performed by the testing team during the Integration/System testing phase for verifying the defect fixes.

Compatibility Testing: Performed by the testing team during the Integration/System testing phase to test compatibility with respect to the base configurations: (a) Browser IE 6.0, OS Win XP; (b) Mozilla Firefox ( ), OS Win XP; (c) Opera ( ), OS Win XP. Certification testing will be performed on the following combination: Browser IE 7.0, OS Win XP.

Interface Testing: Performed by the development team during the Unit Testing phase, by the testing team during the Integration/System testing phase, and by the International-kids.com team during the UAT phase in order to have a complete test.
• Sign on: the PA testing team will be responsible for testing this functionality by accessing the International-Kids.com QA environment.

Adhoc Testing: Performed by the testing team during the Integration/System testing phase to test (1) navigations that are unusual and (2) negative scenarios within and across the components.

1.4 Not in Scope

1. Stress Testing
2. Crash/Recovery Testing
3. When the scope of the new application has been agreed and signed off, no further inclusions will be considered for this release, except:
   • where there is the express permission and agreement of the Business Analyst, Project Manager and the Client;
   • where the changes/inclusions will not require significant effort on behalf of the test team (i.e. requiring extra preparation such as new test conditions) and will not adversely affect the test schedule.

1.5 Reference Documents
# | Reference Document Name
1 | International-Kids.com Software Requirements Document
2 | International-Kids.com Test Plan
1.6 Definitions and Acronyms

Acronym | Description
QA | Quality Assurance
SRD | Software Requirement Document
PM | Project Manager
PL | Project Lead
TL | Technical Lead
International-Kids.com | International-Kids.com

1.7 Assumptions and Dependencies

• The build will be released on time for testing as per the plan.
• All "Show-Stopper" bugs receive immediate attention from the development team.
• Testing of all available features in the International-Kids.com application will be done using the data dump provided by the DBA.
• Enhancements will be incorporated into the original design of the International-Kids.com application. Some enhancements will require further analysis; this analysis will be worked out during the Construction phase.
• All bugs that are prioritized for the next version will be unit tested and fixed by the development team before the next version can be released.
• Functionality of the system will be delivered as per schedule and specifications for the testing team during each phase.
• Required resources will be available. The Project Manager will ensure availability of the environment.
• Once the PA code enters the DBA's development environment, all bugs will be tracked using Bugzilla, which is the DBA's bug tracking tool. All bugs will be tracked under the RIS project in Bugzilla.

2 Test Requirement

2.1 Features to be Tested
1. General requirements
   • General requirements for Landing pages of various Roles/Users (International-Kids.com landing page requirements)
   • Dashboard functionality
2. Roles
   • Functionality/Role of each User, in particular External/Internal user
3. Site entry/exit
   • Login/Logout
   • Registration (Update/view profile page)
4. Application Submission
   • Applicant level validation
   • Update/View/Re-submit
   • Acceptance/Rejection of Application by Moderator
   • Approval of Application by International-Kids.com Admin
5. Peer Review
   • Staff Assignment to Committees
   • Identification of Primary and Secondary Conflicts
   • Assignment to Reviewers
   • Dual Application
   • Brokering list
   • Reviewer(s): access to submitted applications, preferences, add/update Preliminary Scoring/Triage/Critiques/Concerns
   • Meeting: Critique Editing (by Reviewers and Admin), Concern Generation/Resolution
   • Generation of Committee Scores/Final Scores/Median/Variance
   • Average Merit & Percentile Score generation
6. General Administration
   • New/existing users Maintenance/Management
   • Access control/Roles and Responsibilities (security maintenance)
7. Email Functions
   • Initial Notifications, Maintain Message Text

All the features to be tested will be detailed in the respective Test Scenario documents and Test Case documents, based on the test types mentioned in the Scope section. All the Test Scenario documents will be delivered for review during the Pre-construction phase, and the Test Case documents will be delivered in the middle of the Construction phase, just before integration testing begins. Please refer to the "Deliverables" section below for deliverable dates.
Milestones (Schedule)

NOTE: The following dates are projected with the assumption of beginning the Construction phase on 10-Oct-2007. Actual dates will be modified as per the Project plan once the Construction phase begins.

Task Name
• Ramp up: understand requirements and review docs
• Test Plan Development
• Test Plan Review/Updation
• Sign off on Test Plan
• Test Scenario Development
  - Functional Area 1: Application Submission (review and updation)
  - Functional Area 2: Peer Review (review and updation)
  - Non-functionality requirements (Performance, Security, etc.) (review and updation)
• Test Case Development
  - Functional Area 1: Application Submission (review and updation)
  - Functional Area 2: Peer Review (review and updation)
  - Non-functionality requirements (Performance, Security, etc.) (review and updation)
• Establish Testing Environment
• Smoke Testing
• Functionality Testing (includes Functionality, GUI, Usability, Security and Database testing)
• Regression Testing (includes Regression, Adhoc and Defect fix verification testing)
• Integration Testing (includes External Interfaces and Module Interfaces testing)
• Access to International-Kids.com QA Environment
• System Testing (includes all types of testing mentioned above and the test types mentioned below)
• Compatibility Testing
• Performance and Load/Volume Testing
• User Acceptance Testing

Duration / Start Date / End Date: the projected dates run from 10-Oct-2007 through 1-Nov-2007; per-task dates will be tracked in the Project plan.
3 Testing Environment

3.1 Browsers

Browser | Execution | Environment
Internet Explorer 6.0 | √ | Win XP
Internet Explorer 7.0 | Certification | Win XP
Mozilla Firefox | Certification | Win XP
Opera | Certification | Win XP

Execution of test cases:
The "√" symbol above means that the entire set of test cases will be executed on that browser. "Certification" means that selected test cases will be executed to verify the capability of the application on that browser.
3.2 Hardware and Software Requirements

3.2.1 Offshore

This section describes the offshore environment setup used in the development and testing of the application.

3.2.1.1 Development / Development Integration Environment

The offshore Development Environment corresponds to the environment used by the developers during construction. Unit testing of the International-kids.com version is performed on this environment. Each developer machine will have International-kids.com running on an Apache web server. There will be a common MySQL development database server, and all developers will use the same database server.

SOFTWARE
Type | Name | Version | OS
Web Server | Apache | Apache 2.0 | Windows XP
Front End Designing Tool | PHP (Hypertext Preprocessor) | PHP 4.0/5.0 | Windows XP
Scripting Language | JavaScript and Ajax | JavaScript, Ajax | Windows XP
Database | MySQL | MySQL 5.0 | Windows 2000 Server / Windows 2000 Professional
Browser | IE | 6.0 | Windows XP

HARDWARE
Machine Type | HDD | RAM | CPU
Web server | 40 GB | 1 GB | Intel Pentium 4, 2.66 GHz
Database Server (MySQL) | 80 GB | 1 GB | Intel Pentium 4, 2.8 GHz

3.2.1.2 QA
The offshore QA Environment corresponds to the environment on which integration testing is performed for the International-kids.com version.

Offshore QA SOFTWARE
Type | Name | Version | OS
Web server | Apache | Apache 2.0 | Windows XP
Scripting Language | JavaScript, Ajax | JavaScript, Ajax | Windows XP
Database | MySQL | MySQL 5.0 | Windows 2000 Server
Browser (Base) | IE | 6.0, 7.0 | Windows XP
Browser (Certification) | IE, Mozilla Firefox, Opera | 6.0/7.0, 4.0/5.0, 9.22 | Windows XP
Offshore QA HARDWARE
Machine Type | HDD | RAM | CPU | OS | Browser
QA web server | 40 GB | 1 GB | Intel Pentium 4, 2.8 GHz | Windows XP Professional |
QA Database Server (MySQL) | 280 GB | 1 GB | Intel Pentium 4, 2.8 GHz | Windows 2000 Server |
Test 1 (Desktop class) / Base | 40 GB | 1 GB | Intel Pentium 4, 2.4 GHz | Windows XP Professional | IE 6.0
Test 2 (Desktop class) / Certification | 80 GB | 1 GB | Intel Pentium 4, 2.4 GHz | Windows XP Professional | IE 7.0
Bugzilla Server | 40 GB | 1 GB | Intel Pentium 4, 2.66 GHz | Windows XP Professional |
3.3 Human Resources

Resource Title | Number | Date Required | Resource Name
QA Lead | 1 | |
Test Engineer (for performing testing of types: Functionality, GUI/Usability, Database, Smoke, Regression) | 2 | |
Test Engineer (for performing testing of types: Performance, Load/Volume, Compatibility, Security, Adhoc) | 1 | |

4 Roles and Responsibilities
Role: QA Lead (Team member)
Responsibilities:
• Preparing/updating the Test Plan
• Preparing/updating the Test Scenarios
• Reviewing the Test Cases
• Building/deploying the application in the QA/System Integration environment
• Preparation of the Traceability matrix
• Daily test plan preparation*** (please refer below for details)
• Generating the Test Summary report
Secondary Role: Project Lead / Project Manager
Name:
Work Ph, Mobile, Email id:

Role: Tester 1/2/3 (Team member)
Responsibilities:
• Preparing/updating Test Cases
• Reviewing the Test Cases
• Executing test cases in the Integration/System environment
• Recording test results in the Integration/System environment
• Impact analysis for failed test cases
• Logging/verifying/closing and tracking defects
• Raising issues/clarifications in the Issue Tracker/clarification register on Bugzilla
• Performing various types of testing such as Functionality, Smoke, Regression, Adhoc, Security, GUI/Usability, Volume, Compatibility, Performance/Load and Database testing
Secondary Role: QA Lead
Name:
Work Ph, Mobile, Email id:

*** Daily test plan preparation

The QA Lead is responsible for preparing the daily test plan, which shall include the following:
1. Allocate the workload for each tester.
2. Plan to ensure that the tests being performed will cover all required functionality for the required OS and browser.
3. Create and maintain a list that defines the range of scripts/test cases to be completed on specific days.
4. Distribute/communicate this list to the testers.
5. Distribute the daily test plan to the Project Manager.
5 Test Strategy

The Test Strategy presents the recommended approach to the testing of the International-kids.com Development Project. The previous section on Test Requirements described what will be tested; this section describes how it will be tested.

5.1 Test Process Workflow

[Diagram: Test Process Workflow]

The above diagram explains the complete QA process/test life cycle in general. The following steps explain in detail the Test Strategy to be followed for the International-kids.com application.
Step 1. Test Organize/Review Project Documentation
Step 2. Test Plan
Step 3. Test Design/Development
Step 4. Unit Test Execution
Step 5. Integration/System Test Execution
Step 6. Defect Tracking and Management
Step 7. Update Documents and Results
Step 8. Test Reports
Step 9. UAT and Closure
5.2 Test Organize/Review Project Documentation

Documentation reviews provide a means for testing the accuracy and completeness of the planning, requirements and specifications. Throughout the project, periodic reviews will be held to assure the quality of project documentation. These reviews will:
• Ensure project plans have adequate time allocated for testing activities and determine limitations.
• Ensure that the Business Requirements, Information Site Flow, Use Cases, Business Rules, and Technical Design documents clearly articulate the functionality of International-kids.com.
5.3 Develop System Test Plan

This step of the testing process involves creation of the System Test Plan (this document). It will serve as the guidepost for development of test cases and for integration of testing with other project activities.
• This plan describes at a high level the overall testing plan and strategy for the International-kids.com application. Professional Access will follow this plan to develop test scenarios/cases and scripts that will be used for system testing.
• Test scenarios will be described in separate document(s).
• Test cases will be described in separate document(s).
• Professional Access will obtain test accounts and IDs for Interface testing (see Scope).
5.4 Test Design/Development

Brief explanation of the Test Design/Development workflow with respect to the process flow diagram displayed above:

T1, T2: The Test Lead takes part in the preparation of Elaboration phase deliverables such as the Test Plan and updates the artifacts in CVS for further reference.
T3, T4: From the post-Elaboration phase to the pre-Construction phase, Test Scenarios are designed by the Test Lead for the modules/features available in the SRD. Once the final draft version of the SRD with all module/feature specifications is received, Test Scenarios are designed and completed during the pre-Construction phase. All created/updated Test Scenarios are stored in CVS.
T5: The Test Lead assigns the task of test case/test script creation to test team members during the Construction phase.
T6, T7: For all the Test Scenarios created earlier during the post-Elaboration/pre-Construction phase, the test team members design test cases during the Construction phase. All created Test Cases/Test Scripts are stored in CVS.
T8, T9: All created Test Cases/Test Scripts are reviewed by the Test Lead, and all review comments are updated in CVS.
T10: Test team members check the review comments, update the respective test cases and store them in CVS.
T11: The Test Lead maps the requirements to test cases in the Traceability matrix. (The objective of this matrix is to document which test case(s) test which functionality of the software and which structural attribute. It maps test requirements to the test cases/Test Scenarios that implement them.)

Written test cases and scripts will be used to direct system testing efforts. The Professional Access test team will write these in accordance with the System Test Plan.
• Tests will be developed to exercise the required functionality for the website, validate data integrity, and ensure that data is passed or received successfully from external interfaces. Test cases will be written in a separate document appended to this plan.
• Each test case will document the steps or actions required to exercise a specified area of functionality. The test cases will be reviewed to verify that they properly validate the intended functionality. Actual testing will be performed by executing the steps of the test case. A pass/fail notation will be made for each step.
• Each test case will be executed manually and, for Performance/Load testing, using an automated testing tool, on the browser versions mentioned in the Test Environment section. A pass/fail notation will be recorded for each condition tested, noting the severity and reason for each instance of failure. Test scripts to perform Performance/Load testing will be executed automatically during the System Testing phase.
5.5 Unit Test Execution

Unit testing verifies that each module, component, object, or program developed is functionally correct and conforms to requirements. A unit is defined as a single program function in terms of inputs, processes and outputs. A program unit is small enough that the developer who developed it can test it in great detail.

The developer who wrote the code is responsible for creating, updating and executing the unit tests after each successful build in the development environment. A separate document has been prepared drafting the unit test strategy.
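As an illustration only (the actual unit test strategy is in the separate document mentioned above), a developer-level unit test for a single PHP function might look like the following sketch. The function name and the validation rule are hypothetical and are not taken from the application's requirements.

<?php
// Hypothetical unit under test: validates an applicant email address
// before an application can be submitted (illustrative only).
function isValidApplicantEmail($email)
{
    if (!is_string($email) || trim($email) === '') {
        return false;
    }
    // Deliberately simple pattern for the sketch: text@text.tld
    return preg_match('/^[^@\s]+@[^@\s]+\.[^@\s]+$/', $email) === 1;
}

// Minimal developer-run unit test: exercises the inputs, process and
// outputs of this single function, as described above.
$cases = array(
    array('reviewer@example.org', true),   // well-formed address
    array('',                     false),  // empty input
    array('not-an-email',         false),  // malformed input
);
foreach ($cases as $case) {
    list($input, $expected) = $case;
    $actual = isValidApplicantEmail($input);
    echo ($actual === $expected ? 'PASS' : 'FAIL')
        . " isValidApplicantEmail('" . $input . "')\n";
}
?>

A check like this would be run by the developer after each successful build in the development environment, before the build is handed over for smoke testing.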
5.6 Integration/System Test Execution

5.6.1 Integration Testing

The objective of these tests is to ensure that all the components of the system function properly together and that the application interfaces properly with external applications.

Entrance Criteria
• All functions to be tested have successfully passed unit testing.
• All Severity 1 and 2 defects are fixed and have successfully passed unit testing. (See the Defect Management portion of this document for severity definitions.)
• The software build is properly version controlled.
• A build report has been completed and submitted with the build.
• All hardware and software configurations are in place and ready to test.
• All test cases required for integration testing have been prepared.
• All required integrated systems are available.

Exit Criteria
• All components delivered and tested function as detailed in the documents in the References portion of this document.
• Test cases have been updated if and when functionality has changed.
• The test results report is developed/updated.
• All new defects have been logged into the issue-tracking database.

5.6.2 System Testing

The test team will conduct a system test to verify that the software matches the defined requirements. Once the application has executed successfully under integration test, each test suite will be executed against the other supported configurations to ensure defects are not created because the system configuration has changed. A separate test environment must be established for all hardware, software, and browser configurations supported.

Entrance Criteria
• All functions tested have successfully passed integration testing.
• All Severity 1 and 2 defects are fixed and have successfully passed regression testing.
• Test cases have been updated if and when functionality has changed.
• All test cases required for system testing have been prepared.
• All hardware and software configurations are in place and ready to test.
• All required integrated systems are available.

Exit Criteria
• All Severity 1 and 2 defects are fixed and have successfully passed regression testing.
• The risks associated with not correcting any outstanding Severity 3 and 4 defects have been identified and signed off by the Project Manager, Technical Lead and QA Lead.
• All components delivered and tested function as detailed in the documents in the References portion of this document.
• Regression tests have been performed and executed successfully.
• The test results report is developed/updated.
• All new defects have been logged into the issue-tracking database.

5.6.3 Testing Types
The scope of the work is to conduct testing in the following areas:
♦ Functionality
♦ Database
♦ Smoke
♦ Security
♦ User Interface/Usability
♦ Compatibility
♦ Performance/Load/Volume
♦ Adhoc
♦ Regression

♦ Functionality Testing
The objective of this test is to ensure that each element of the application meets the functional requirements of the business as outlined in:
• the Software Requirement Document/Use Cases;
• the Software Design Document;
• other functional documents produced during the course of the project, i.e. resolutions to issues/change requests/clarifications/feedback.
Secondly, it includes specific functional testing, which aims to test individual process and data flows. This stage will also include Validation Testing, which is intensive testing of the new front-end fields and screens.
Functionality testing will be performed on every build, that is, from when the build series (two-week test process cycle) commences until the final system-testing pass. In other words, functionality testing will be performed by the testing team just after the development of a set of features, as decided by the Technical Lead/Project Manager, basically as part of integration testing. This process will continue until the completion of the System Testing phase.
♦ Database Testing

The database will be tested from the following perspectives:
• Testing the database schema (stored procedures, triggers, views, etc.) after migration (done by the MySQL DBA developer).
• Testing the database which houses the content that the International-kids.com application manages, runs queries against and uses to fulfill user requests for data storage (done by the Testing team).
Issues to test are:
• Data integrity errors (missing or wrong data in tables).
• Output errors (errors in writing, editing or reading/retrieving/querying operations on the tables).
Database testing will be performed along with functionality testing on every build, from the first build series (two-week test process cycle) until the final build series.
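A minimal sketch of the kind of data-integrity check the testing team might script against the MySQL database is shown below. The host, credentials, table and column names (applications, applicant_email) are hypothetical; the real schema is defined in the design documents.

<?php
// Illustrative data-integrity check (hypothetical table/column names).
// Connects to the QA MySQL server and looks for rows that violate a simple
// integrity rule: every submitted application must have an applicant email.
$link = mysql_connect('qa-db-host', 'qa_user', 'qa_password');
if (!$link || !mysql_select_db('internationalkids_qa', $link)) {
    die('Could not connect to the QA database: ' . mysql_error());
}

$sql = "SELECT COUNT(*) AS bad_rows
        FROM applications
        WHERE status = 'SUBMITTED'
          AND (applicant_email IS NULL OR applicant_email = '')";
$result = mysql_query($sql, $link);
$row = mysql_fetch_assoc($result);

// A non-zero count is a data integrity error (missing data in tables)
// and would be logged as a defect in Bugzilla.
echo ($row['bad_rows'] == 0 ? 'PASS' : 'FAIL')
    . ': submitted applications without applicant email = '
    . $row['bad_rows'] . "\n";

mysql_close($link);
?>

Queries of this kind cover the "missing or wrong data in tables" class of issues; output errors would be checked by comparing the data shown on screen against the rows actually stored.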
User Interface/Usability Testing
The usability testing will be accomplished by verifying the information in each window is accurate. Menus, icons and toolbar toolbar functionali functionality ty will be tested tested as applicable applicable to the navigation and results panes. Importance will be given to graphics, contents, data presentation, feedback and error messages, design approach, user interface controls, formatt formatting ing,, instru instructi ctions ons e.t.c. e.t.c. Multi Multi Window Window Overla Overlappi pping ng will will be tested tested because because product supports opening of multiple documents. GUI/Usability testing will be performed along with functionality testing on every build, right from the First Build series (2 weeks test process cycle) till the final Build series.
♦
Adhoc Testing
Adhoc Testing is done on every build right from first Build series till the last Build series. This is mostly experience based testing and carried out from the application usage perspective. Just based on knowledge of functionality/ies the test team member will perform this test. He/she need not refer to any Test case/Scenario/Plan. User concentrates on navigations that are unusual, negative or across the components. During the second week of every Build series (2 week test process cycle) Adhoc testing will be performed. This test will be performed during Integration test phase as well as System test phase. ♦
Smoke Testing
During Integration testing, which is performed in parallel with development phase, every time before releasing the Build to QA team, Development team performs Smoke testing to check whether mentioned/planned set of features have been implemented without getting into details. When once the build is released to QA team, before accepting the Build for further testing process, Smoke testing is performed to check whether the application’s planned set of most crucial functionalities work, without bothering with finer details. It does mean that for a released build, availability of all the features as mentioned in Release notes will be tested.
Once the system passes the smoke test, it is subjected to further tests. Before commencing system testing too, the QA team will perform smoke testing to check, at a high level, whether all functionalities have been implemented in the system.
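For illustration, a smoke check of this kind is often scripted as a quick sweep over the most crucial pages. The sketch below assumes hypothetical URLs on the QA web server and simply verifies that each page responds with HTTP 200; the real list of crucial features comes from the build's release notes.

<?php
// Illustrative smoke check: hit the most crucial pages of the QA deployment
// and verify each one responds with HTTP 200. URLs are hypothetical.
$crucialPages = array(
    'http://qa-server/internationalkids/index.php',        // home page
    'http://qa-server/internationalkids/login.php',        // sign on
    'http://qa-server/internationalkids/application.php',  // application submission
    'http://qa-server/internationalkids/review.php',       // peer review
);

$allPassed = true;
foreach ($crucialPages as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // do not echo the page body
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    $ok = ($status == 200);
    $allPassed = $allPassed && $ok;
    echo ($ok ? 'PASS' : 'FAIL') . " [$status] $url\n";
}

// The build is accepted for further testing only if every crucial page loads.
echo $allPassed ? "Smoke test passed\n" : "Smoke test failed - reject build\n";
?>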
♦ Compatibility Testing
• Browsers: The compatibility matrix, in which different brands and versions of browsers are tested against a certain number of components and settings (for example applets, client-side scripting, ActiveX controls, HTML specifications, graphics or browser settings), is given in Section 3.2.
• Settings, Preferences: Depending on the settings and preferences of the client machine, the web application may behave differently. Options such as screen resolution and color depth will be considered while testing.
• Printing: Despite the paperless society the web was supposed to introduce, printing is done more than ever. Testing will be performed to check whether the pages are printable, with consideration of:
  - text and image alignment;
  - colors of text, foreground and background;
  - scalability to fit paper size, etc.
A selected set of Usability/GUI test cases will be executed as part of compatibility testing during the System Testing phase.

♦ Security Testing
Security tests will determine how secure the new AHA-RSDP system is. The tests will verify that unauthorized user access to confidential data is prevented. This type of testing will be performed to check:
• that for each known user type the appropriate functions/data are available and all transactions function as expected and run in prior Application Function tests;
• directory setup;
• that, without authorization, access permissions are not provided to edit scripts on the server;
• the time-out limit;
• bypassing the login page by typing the URL of an inner page directly into the browser, etc.
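One of the checks above (bypassing the login page by requesting an inner URL directly) lends itself to a simple scripted probe. The sketch below is illustrative only; the protected URL and the expected redirect-to-login behaviour are assumptions rather than documented application behaviour.

<?php
// Illustrative security probe: request a protected page without any session
// cookie and confirm the application does NOT serve confidential content.
// The URL and the "redirect to login" expectation are assumptions.
$protectedUrl = 'http://qa-server/internationalkids/admin/committee_scores.php';

$ch = curl_init($protectedUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // capture instead of printing the body
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);  // we want to see the redirect itself
curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

// Acceptable outcomes for an unauthenticated request: a redirect to the
// login page (3xx) or an explicit 401/403. A 200 with report data is a defect.
if ($status == 401 || $status == 403 || ($status >= 300 && $status < 400)) {
    echo "PASS: unauthenticated access blocked (HTTP $status)\n";
} else {
    echo "FAIL: protected page returned HTTP $status - log a defect in Bugzilla\n";
}
?>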
♦ Performance and Load/Volume Testing

Performance testing verifies the application's response time under maximum load conditions. The purpose of performance testing is to measure the application under load conditions. Subjecting the application to expected peak loads before release helps ensure software quality. Questions answered are:
(1) Do applications and databases perform correctly under load?
(2) What response time can be expected, and will it meet requirements?
(3) What operations negatively impact performance?
Performance testing procedures

The general approach for load testing is to set up a test website configuration and to run selected test scripts against it to measure performance. The configuration and test environment should mirror the production environment. Individual tests will be run to verify correct operation of the scripts. Then the scripts will be run again in several cycles; each cycle will increase the number of concurrent users until the required system capacity has been successfully demonstrated.
The testing process is inherently iterative, since early tests may encounter bottlenecks or defects. The tests will need to be repeated after the system has been tuned or reconfigured or the defects have been corrected. In many cases, one bottleneck may obscure the presence of another. Thus, once problems have been corrected, it is possible (even likely) to encounter others on subsequent trials.

The goals of performance testing are to:
1) Determine whether the customer will experience unacceptable response time when the website is under load.
2) Determine whether the web server, application server or database server will crash under load.
3) Tune the application based on performance issues found.

Metrics that we will attempt to achieve include:
• The response time from the point when the web server receives a page request to the point when the web server serves the requested page. This metric will be revisited once the pages have been built to determine an acceptable response time. There will be separate metrics for the search results pages vs. the other pages.
• Concurrent users*
• Active users**
* Concurrent users refers to users who are maintaining an active session with the site and may or may not be actively clicking on the site. (Please see the Technical Specification for details.)
** Active users refers to those users who are actually clicking on the site at any given time. (Please see the Technical Specification for details.)
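As a rough illustration of the cycle described above (not the actual load-test tooling, which is still to be selected), the sketch below fires a small batch of concurrent requests at a hypothetical page URL using PHP's curl_multi functions and reports the slowest response time. A real run would repeat this with increasing user counts until the required capacity is demonstrated.

<?php
// Illustrative load-test cycle: issue $concurrentUsers simultaneous requests
// against one page and report the slowest response time. The URL and the
// number of simulated users are assumptions for this sketch.
$pageUrl         = 'http://qa-server/internationalkids/search.php?keyword=science';
$concurrentUsers = 10;

$multi   = curl_multi_init();
$handles = array();
for ($i = 0; $i < $concurrentUsers; $i++) {
    $ch = curl_init($pageUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($multi, $ch);
    $handles[] = $ch;
}

// Run all requests to completion.
$running = null;
do {
    curl_multi_exec($multi, $running);
} while ($running > 0);

// Collect per-request timings and HTTP codes.
$worst = 0.0;
foreach ($handles as $ch) {
    $time   = curl_getinfo($ch, CURLINFO_TOTAL_TIME);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    if ($time > $worst) {
        $worst = $time;
    }
    echo "HTTP $status in {$time}s\n";
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);

echo "Slowest response with $concurrentUsers concurrent users: {$worst}s\n";
// In the real procedure this cycle is repeated with more users until the
// required capacity is demonstrated or a bottleneck is found.
?>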
Areas of the website that we recommend load testing include:

Site Area | Concurrent Action Tested | Load Area
Home Page | Where multiple users access the home page | Page loading/performance
Registration | Where multiple users register using different usernames | Updating the database
Registration | Where multiple registered users log in | Accessing the database
Search/Browse | Where multiple users browse the same category | Accessing the database
Search/Browse | Where multiple users browse different categories | Accessing the database
Search/Browse | Where multiple users search various keywords | Accessing the database
General site Navigation | Where multiple users navigate general site functionality | Page loading/performance

♦ Regression Testing
A regression test will be performed subsequent to the release of each build, from the second release onwards, to ensure that:
• there is no impact on previously released software with the addition of new functionality;
• there is an increase in the functionality and stability of the software; and
• there is no impact on previously released software with the resolution of defects.

5.6.4 Test Execution Workflow
Test Method:

The following activities will be performed during the test process:
• The development team will verify through unit testing that each module, component, object, and program is functionally correct and conforms to the use case definitions document.
• The test team will conduct a functionality/integration test of the larger system to ensure that all the functionalities/components of the system function properly together and that the application interfaces properly with external application(s).
• The test team will conduct a system test to verify that the software matches the defined requirements. All the test cases/scripts executed during previous QA cycles will be re-executed to check the correctness of the system. Once the application has executed successfully under integration test, each test suite will be executed against the other supported configurations to ensure defects are not created because the system configuration has changed. A separate test environment must be established for all hardware, software, and browser configurations supported.
• The test team recommends that performance testing be done using a performance tool. The purpose of load testing is to ensure stability of the application under simulated load conditions. Automated performance testing tools can simulate the load on the system being tested, eliminating the necessity of employing hundreds of users, a huge volume of data and many transactions, and of obtaining the required equipment.
• The test team will conduct the tests by executing the test cases and scripts. Each test case will test a specific area of functionality. Test cases will be comprised of several test scripts that detail that functionality. The test cases will be reviewed to ensure that they cover the scenarios needed to adequately test the site and its functionality. Each test case will have an expected result and a pass/fail column. If the expected result is achieved, a value of "Y" will be recorded in the actual results column. If the expected result is not achieved, a value of "N" will be recorded in the actual results column, and the defect will be logged in the issue-tracking database. The actions that led to the failure and an assessment of its severity will also be noted in the issue-tracking database.
• The development team will fix defects based on the level of severity assigned by the test team. The defect information will be recorded in the issue-tracking database (Bugzilla), and the developers will be informed of each new issue via email. The severity levels to be used during the test are described in the Defect Management portion of this document.
• The test team will receive notification via email after each defect has been corrected and unit tested by the development team. The test team will retest the defect by re-executing the test case and script in which the defect was found. The regression test will verify that the altered code has not adversely impacted previously working functionality.
• The test team will track all the test cases and test scripts using a Traceability document.
• Included within the scope of the test is an external interface test, designed to verify that all components provided by third-party providers interface and interact according to specifications.
• A separate test environment will be established for all hardware, software and browser configurations supported. Refer to the Hardware and Software Requirements section for more information.
The following diagram explains the flow of test types/phases followed for the International-kids.com application.

[Diagram: Test types/phases flow]

Test Flow:
Testing of the International-kids.com application will be performed at feature level. A two-week internal build release approach will be adopted while testing. Integration/Functionality testing starts as soon as the first set of features is developed/released by the development team, following the build series procedure. This process continues until the completion of System Testing. The development team will decide and inform the testing team about the set of features planned for every build release, so that Test Scenarios/Test Cases can be developed and reviewed well in advance.

The typical flow of activities in a two-week QA test process cycle (build series) can be summarized in the table below.

Day | Series Phase | Activities
Monday | Start of Build Series N | Build Series N: test initialization activities; receive Build N and release notes by 1 P.M.; deploy the build; run smoke test cases and Round 1 testing
Tuesday | | Build Series N: Round 1 testing
Wednesday | | Build Series N: Round 1 testing
Thursday | | Build Series N: Round 1 testing
Friday | | Build Series N: end Round 1 testing. Build Series N+1: features/modules acquisition, planning, effort estimation, resource allocation
Saturday | |
Sunday | |
Monday | | Build Series N: start Round 2 testing
Tuesday | | Build Series N: Round 2 testing
Wednesday | | Build Series N: Round 2 testing. Build Series N+1: submit Test Scenarios/Cases for review
Thursday | | Build Series N: Round 2 testing
Friday | End of Build Series N | Build Series N: test summary/conclusion report generation by end of day. Build Series N+1: update test cases based on review feedback, prepare for Series N+1

Assuming that the test case/script execution process begins on 15-Jan-2007 (subject to change), the QA team will execute the following testing cycles, considering:
1) Smoke test pass
2) Functionality test pass (which includes testing types like Functionality, GUI/Usability and Database, and the non-functionality test type Security). These are in turn divided into two categories based on builds:
   "1" - execution of test cases for features included in the current build
   "2" - re-execution of test cases for features included in all the previous builds
3) Integration test pass (which includes testing types like module interfaces and external interfaces)
4) Regression test pass (which includes testing types like Regression, Ad hoc and defect fix verification)
5) System test pass (which includes all of the above types of testing plus testing types like Performance, Load/Volume and Compatibility)

The planned QA cycles/build series are listed below; each cycle covers the Smoke, Functionality (categories 1 and 2), Integration, Regression and System test passes as applicable, together with test case/scenario updating and addition:

• Cycle/Build Series 1: 2 weeks
• Cycle/Build Series 2: 2 weeks
• Cycle/Build Series 3: 2 weeks, followed by 1 week buffer time
• Cycle/Build Series 4: 2 weeks
• Cycle/Build Series 5: 2 weeks
• Cycle/Build Series 6: 2 weeks
• Cycle/Build Series 7: 2 weeks, followed by 1 week buffer time
• Cycle/Build Series 8: 2 weeks
• Cycle/Build Series 9: 2 weeks
• Cycle/Build Series 10: 2 weeks
• Cycle/Build Series 11: 2 weeks
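To make the two-week cycle above concrete, the sketch below lays out one Build Series against calendar dates, assuming the series starts on a Monday. The helper name and the 15-Jan-2007 example (the assumed start of test execution) are illustrative only and not part of the plan.

```python
from datetime import date, timedelta

# Day offsets within one two-week Build Series, taken from the schedule above.
SCHEDULE = [
    (0,  "Start of Build Series N: initialization, receive build and release "
         "notes by 1 P.M., deploy build, smoke test, start Round 1"),
    (1,  "Round 1 testing"),
    (2,  "Round 1 testing"),
    (3,  "Round 1 testing"),
    (4,  "End Round 1; Series N+1 feature acquisition, planning, estimation"),
    (7,  "Start Round 2 testing"),
    (8,  "Round 2 testing"),
    (9,  "Round 2 testing; submit Series N+1 test scenarios/cases for review"),
    (10, "Round 2 testing"),
    (11, "End of Build Series N: test summary/conclusion report; update "
         "Series N+1 test cases from review feedback"),
]

def build_series_calendar(series_start: date):
    """Yield (calendar date, planned activities) for one build series."""
    for offset, activities in SCHEDULE:
        yield series_start + timedelta(days=offset), activities

# Example: a series starting on Monday 15-Jan-2007.
for day, what in build_series_calendar(date(2007, 1, 15)):
    print(day.strftime("%a %d-%b-%Y"), "-", what)
```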
1.26 Defect Tracking and Management

The defect management process ensures maximum efficiency in defect recognition and resolution. The objectives of this process are:
• To maintain a defect tracking system to reliably monitor defects and fixes.
• To preserve a history of defects and their fixes.
• To ensure prompt and efficient identification and notification of defects.
• To provide timely fixes and deployment.
The QA team will use Bugzilla (a defect tracking tool), which will allow PA developers and QA members to carry out a full test cycle: find, log, assign, fix, verify, resolve, and close. The number of defects that surface during the QA testing period, including their potential impact and the complexity of fixing them, can be quite unpredictable. The PA Technical Lead / Project Manager will respond to defects in the minimum time possible and assign fixes to a particular build. Careful review of the impact of an implemented fix will minimize recurrence and/or the introduction of new problems.

However, since testing alone cannot fully verify that software is complete and correct, PA takes a comprehensive validation approach. QA processes are integrated into all stages of PA development from the start of the engagement (e.g., large-scale planning, unit testing, etc.).
The Bugzilla defect tracking tool will be used for defect tracking and reporting. It can be accessed via the web:
• URL =
• Project name = International-kids.com
• Each team member will be given a User ID and Password

The following activities are performed during the defect tracking process:
1) A test engineer executes the test case/script and compares the actual result with the expected result. He/she records the outcome in the results column of the test case document against each test case by marking it "Pass" or "Fail".
2) When a test case fails, after the result is updated in the test case document, a defect is entered into Bugzilla and the corresponding defect reference number is noted in the test report (the test case document used for testing).
3) The following information is entered for every defect in each defect report (a minimal record sketch follows this process list):
   1. Bug number
   2. Summary
   3. Description
   4. Steps to re-create the problem
   5. Attachments, if any
   6. Configuration the problem was found in (browser/OS/version)
   7. Function/component/module the problem was found in
   8. Severity of the problem
   9. Owner/Assigned to
   10. URL
   11. Status
   12. Submit date
   13. Submitter/Reporter
   14. Resolution
4) The defect is assigned to the QA Lead, who will in turn monitor all defects for completeness before submission to the Development Tech Lead.
5) All defects will be checked for duplicates in Bugzilla before submission to the Development Tech Lead.
6) Defects should be reproducible before being submitted to the Development Tech Lead.
7) The QA Lead will monitor all defects that are in the escalation process. Defects will be classified, managed and escalated using a process agreed upon between AHA and Professional Access.
8) The Tech Lead, along with the module lead, will review the defects. If a defect is valid, the Tech Lead will assign it to the respective developer; otherwise it is rejected with a stated reason and re-assigned to the respective reporter/submitter.
9) Defects will be fixed based on severity. Defects entered as Severity 1 (Critical/Showstopper) or Severity 2 (High) must be corrected before the application is deployed. Severity 3 (Medium) defects will be corrected based on consensus between the Project Manager, Technical Lead and QA Test Lead regarding their criticality.
10) The person who has been assigned the defect carries out the impact analysis (identifies the cause of the problem, the impacted components and the fix to be carried out) and then fixes the defect appropriately. He/she records the impact analysis briefly in Bugzilla.
11) Integration/System test cases are updated by the respective submitter/reporter if the defect escaped because no corresponding Integration/System test case existed in the Integration/System testing that was carried out.
12) Any further defects are captured and tracked to closure using Bugzilla.
13) Regression testing is performed, ideally by re-running the Integration/System tests of the changed programs. The modified components are re-baselined on successful conclusion of these tests.
14) The product is re-integrated, revised components are built, and full System and Integration testing is re-run.

Test cases are re-executed under the following circumstances:
• After a fix, a change or an enhancement.
• To re-verify all functions of each build of the application.
• To confirm that no new problem has been introduced by a fix or change ("ripple effect").
• During System Testing.

The diagram below provides an overview of the defect tracking process:
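As a rough illustration of the defect record described in step 3 of the process above, the sketch below models the fields a tester fills in when logging a bug. The class name, field names and example values are assumptions made for illustration; they do not reflect Bugzilla's internal schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DefectReport:
    """Minimal model of the information entered for every defect (step 3)."""
    bug_number: int
    summary: str
    description: str
    steps_to_recreate: List[str]
    configuration: str            # browser / OS / version
    component: str                # function / component / module
    severity: int                 # 1 = Critical/Showstopper ... 4 = Low/Cosmetic
    assigned_to: str              # owner
    url: str
    status: str                   # e.g. NEW, ASSIGNED, RESOLVED, CLOSED
    submit_date: str
    submitter: str
    resolution: Optional[str] = None
    attachments: List[str] = field(default_factory=list)

# Hypothetical example entry.
report = DefectReport(
    bug_number=101,
    summary="Login button unresponsive",
    description="Clicking Login does nothing on the home page.",
    steps_to_recreate=["Open home page", "Enter valid credentials", "Click Login"],
    configuration="Firefox 2.0 / Windows XP",
    component="Login",
    severity=1,
    assigned_to="QA Lead",
    url="http://example.test/login",
    status="NEW",
    submit_date="2007-01-15",
    submitter="Test Engineer",
)
```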
International-kids.com Defect Classification:

Defects identified by the PA testing team will be classified based on the guidelines explained below. Apart from these guidelines, the context of a defect also has to be considered for its proper classification. Each defect falls into one of the following categories:

Severity 1 (Critical/Showstopper):
• Causes global data corruption
• Missing functionality critical to site operation that was defined in specifications
• Critical function not operational (typically a crash, severe application deficiency or malfunction); no workaround exists
• A defect that would adversely impact the reputation of the client and is a critical business issue

Severity 2 (High/Major):
• Component/system hang or local data corruption
• An emergency defect that has been determined to have a workaround
• Non-critical function not operational, with no workaround; if a workaround is determined to be available, the defect will be reclassified as Medium
• Slow performance. Examples: a link broken but accessible via another click stream, garbled text in a paragraph, invalid data in fields, unapproved content, and painfully slow downloads

Severity 3 (Medium/Normal):
• Non-critical function not operational; a workaround is available
• Nonessential feature or function is missing or broken
• Operation is not user friendly or is somewhat inconvenient
• Display typos or misalignments that do not affect system operation
• Bewildering dialog boxes or instructions
• Inaccurate spelling or grammar
• Functional or usage defect which does not hamper the major usage of the application

Severity 4 (Low/Cosmetic/Minor):
• Display typos or misalignments that do not affect system operation
• An incorrect color on an element
• An incorrect object label
• Operation is somewhat inconvenient
• This defect is not related to application functionality and mainly consists of aesthetic and usage issues
Priority:

Priority describes the importance and the order in which a bug should be fixed. The available priorities are:

P1 (High): Resolve the defect with immediate effect, in the very next release. Applies when the defect:
• Prevents further testing
• Makes a full feature unavailable
• Is a client request
• Has a severe impact on the client
• Affects other features

P2 (Medium): Resolve the defect at the earliest opportunity, before the intermediate release (if any).

P3 (Low): Normal defect; resolve before the final client release.

P4 (Very low): Could be fixed based on triage, considering:
• Enhancements
• Necessity of the bug fix for the final client release
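Putting the severity and priority rules above together, the sketch below shows one way a triage helper could summarize when a defect has to be resolved. The function and constant names are assumptions for illustration; the mapping simply restates the rules given in this plan.

```python
# Severity 1 (Critical/Showstopper) and 2 (High) must be fixed before deployment.
MUST_FIX_BEFORE_DEPLOY = {1, 2}

# Target release per priority level, paraphrased from the table above.
PRIORITY_TARGET = {
    "P1": "very next release",
    "P2": "before intermediate release (if any)",
    "P3": "before final client release",
    "P4": "based on triage",
}

def triage(severity: int, priority: str) -> str:
    """Summarize when a defect has to be resolved."""
    target = PRIORITY_TARGET.get(priority, "unscheduled")  # fallback is an assumption
    if severity in MUST_FIX_BEFORE_DEPLOY:
        return f"blocks deployment; fix in the {target}"
    if severity == 3:
        return f"fix per PM / Tech Lead / QA Lead consensus; target {target}"
    return f"cosmetic; target {target}"

print(triage(1, "P1"))  # -> blocks deployment; fix in the very next release
```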
1.27 Update Documents and Results
• Update the test scenarios, test cases and scripts if and when the functionality or workflow changes.
• Update the test case documents with results (Pass/Fail) every time test cases are executed.
• Update the test case documents when there is no test case corresponding to a defect raised from unusual flows, if any.
• Update the traceability matrix every time scenarios/cases are updated or added.
• Develop the Test Results Report (daily).
• Prepare and review the Conclusion Report.
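A minimal sketch of the update activities listed above, assuming test results and the traceability matrix are kept as simple in-memory mappings; the identifiers (TC-101, REQ-12) and field names are purely illustrative and not prescribed by this plan.

```python
from datetime import date

# Test case document and traceability matrix, modeled as plain dicts.
test_cases = {"TC-101": {"requirement": "REQ-12", "result": None, "last_run": None}}
traceability = {"REQ-12": ["TC-101"]}   # requirement -> covering test cases

def record_result(tc_id: str, passed: bool) -> None:
    """Update the test case entry with Pass/Fail and the run date."""
    tc = test_cases[tc_id]
    tc["result"] = "Pass" if passed else "Fail"
    tc["last_run"] = date.today().isoformat()

def add_test_case(tc_id: str, requirement: str) -> None:
    """Add a new case (e.g., for an unusual flow) and update traceability."""
    test_cases[tc_id] = {"requirement": requirement, "result": None, "last_run": None}
    traceability.setdefault(requirement, []).append(tc_id)

record_result("TC-101", passed=False)   # a failed run would also trigger a Bugzilla defect
add_test_case("TC-205", "REQ-12")
```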
1.28 Test Reports

Status Reporting
1) Bugzilla will be used to log bugs. A bug report should have sufficient information to reproduce the bug.
2) QA testing status will be reported to the Project Manager on a daily/weekly basis by producing Test Results reports. These reports should include, but are not restricted to, the following:

Individual project status report:
• Name of tester
• Types of testing performed
• Number of test cases/scripts executed by him/her
• Number of test cases/scripts not executed by him/her
• Number of defects logged (valid, invalid, duplicate)

Test case/script execution report:
• Number of features available for testing
• Total number of test cases/scripts generated
• Number of test cases/scripts executed per tester
• Types of testing performed
• Percentage of total test scripts completed

Defect status report:
• Total number of defects logged
• Total number of defects verified/closed
• Total number of open defects
• Issues, if any

Defects requiring escalation report:
• Total number of Severity 1 defects
• Total number of Severity 2 defects
• Total number of Severity 3 defects
• Components/functional areas affected
• Date detected
• Current status
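As a sketch of how the defect status report counts above could be derived from Bugzilla data, assuming each defect is represented as a dict with a status field; the set of "open" statuses (Unconfirmed, New, Assigned, Reopened) follows the definition used in the UAT completion criteria later in this plan.

```python
from collections import Counter

OPEN_STATUSES = {"UNCONFIRMED", "NEW", "ASSIGNED", "REOPENED"}

def defect_status_report(defects):
    """defects: iterable of dicts with at least a 'status' key."""
    by_status = Counter(d["status"] for d in defects)
    total = sum(by_status.values())
    closed = by_status["VERIFIED"] + by_status["CLOSED"]
    open_count = sum(by_status[s] for s in OPEN_STATUSES)
    return {
        "total_logged": total,
        "verified_or_closed": closed,
        "open": open_count,
    }

sample = [{"status": "NEW"}, {"status": "CLOSED"}, {"status": "ASSIGNED"}]
print(defect_status_report(sample))
# {'total_logged': 3, 'verified_or_closed': 1, 'open': 2}
```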
3) The QA team and Project Manager will conduct a daily/weekly bug scrub meeting. The following information will be discussed:
• Current status vs. planned (are we on schedule?)
• Test cases/scripts execution completed (can be at feature level)
• Number of defects open and their severity (from Bugzilla)
• Summary of QA progress
• Issues that need clarification/action
Conclusion Report
Upon conclusion of the QA test cycle, the QA/Test Lead will document the results of the test phase of the International-kids.com system in the Conclusion Report. This report contains information such as:
• QA test cycle number, duration and dates
• List of test cases executed
• Test team members
• Test case results
• Number of defects logged, with status
• Metrics to quantify the success of the project
• Copy of the defect log
Test Summary Report
The Test Summary Report will be a combination of all the above reports, presenting the final testing status at the intermediate/final release.
1.29 UAT and Closure
International-kids.com will perform User Acceptance Testing for the migrated International-kids.com application using the International-kids.com UAT environment. The International-kids.com team will make this environment available either through their hosting provider or by hosting it internally. The purpose of these tests is to confirm that the system has been developed according to the specified user requirements and is ready for operational use. The following are the anticipated tasks in making this environment available:
• Apache
• PHP, Ajax, JavaScript
• MySQL instance with data ready for testing

PA will coordinate with the International-kids.com Deployment Specialist on the configuration of the environment, will provide the International-kids.com code and consolidated migration scripts ready for installation into the User Acceptance environment, and will perform resolution of defects found during User Acceptance Testing.

Testing will be deemed complete upon the execution of all of the following:
• All Functionality/Integration/System test cases.
• All outstanding issues are reviewed and accepted by the International-kids.com and PA Project Teams. These issues will include all major, critical and blocker defects.
• Signed-off creative and production components have been received.
• The product is acceptable, based upon the following principles:
   o System testing is 100% complete and all fixed issues have been regressed and closed.
   o The calculation uses the total number of valid open bugs divided by the total number of bugs in the system. Total open bugs include bugs that are Unconfirmed, Assigned, New or Reopened, whereas total bugs include all bugs in the database with no exclusions.
   o All Major, Critical and Blocker/Showstopper bugs are closed.
   o 95% must be maintained upon the release of the project.
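The completion criteria above imply a simple closure calculation. The sketch below assumes the 95% figure refers to the share of bugs that are no longer open; the plan does not spell the formula out, so this reading, and the function names, are assumptions.

```python
# Open statuses as defined in the completion criteria above.
OPEN_STATUSES = {"UNCONFIRMED", "NEW", "ASSIGNED", "REOPENED"}

def closure_rate(bugs):
    """bugs: iterable of dicts with 'status'; returns the fraction not open."""
    total = 0
    open_count = 0
    for bug in bugs:
        total += 1
        if bug["status"] in OPEN_STATUSES:
            open_count += 1
    return 1.0 if total == 0 else (total - open_count) / total

def release_ready(bugs, blocking_open=0):
    """No open Major/Critical/Blocker defects and at least 95% closure."""
    return blocking_open == 0 and closure_rate(bugs) >= 0.95
```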
5 Configuration Management
Please refer to the Configuration Management document, which explains the complete configuration management workflow to be followed.
6 Deliverables
NOTE: The following dates are projected on the assumption that the Construction phase begins on 11-Dec-2006. Actual dates will be modified as per the Project Plan once the Construction phase begins.

Deliverable name (deliverable date, where known):
• Test Plan: 10-Oct-2007
• Test Scenarios
   o FunctionalArea1: Application Submission
   o FunctionalArea2: Peer Review
   o FunctionalArea3: Pre Awards
   o FunctionalArea4: Post Awards
   o FunctionalArea5: Reports and Admin
   o Non-functional requirements (Performance, Security, etc.)
• Test Cases
   o FunctionalArea1: Application Submission
   o FunctionalArea2: Peer Review
   o FunctionalArea3: Pre Awards
   o FunctionalArea4: Post Awards
   o FunctionalArea5: Reports and Admin
   o Non-functional requirements (Performance, Security, etc.)
• Test case execution report
• Test results reports
• Test summary/conclusion report