
Project Plan & Procedures
 
Version 1.21d
 
Edition 1
Document ID:  PP&P 
Author(s):  Phillip Hawthorne
Tristan Austin
Matthew Sykes
Danny Goulas 
Reviewed by:  Daniel Woodhouse
Jaycee Phua
Matthew Sykes
Simon Hutchison
George Panagiotopoulos
Nicholas Phipps 
Date:  2 September 1999 


Revision History

Date 
Modifications 
Reason 
Version 
1999.04.07  Various sections combined (DW)  Initial document collation  0.1d 
1999.04.08  Document contents formatted (JP)  Conformance to Style Specification  0.3d 
1999.04.08  DSE Section (PH) added (JP)  Progressive collation of document  0.4d 
1999.04.08  Reviewer added (MS)  Quality Assurance check  0.7d 
1999.04.08  Various content changed (JP)  QA Reviewer suggestions (MS)  0.7 
1999.04.09  Collation complete (all major sections added) (JP)  Preparation for 1st Draft submission  1.0d 
1999.08.02 Initial revision to reflect new team structure and changes to team organisation (PH)  Document was out of date 1.1d 
1999.08.19 Made necessary changes from PH review. (DW) Document was out of date 1.2d
1999.09.02 Cover Page formatting updated (JP) Reflect changes in Style 1.21d



Table Of Contents

 
1. System Development Plan
1.1 Major Milestones
1.2 Gantt Chart
1.3 Procedures
1.3.1 Requirements Analysis
1.3.2 Verification
1.3.3 Requirements Document Development
1.3.4 Validation
1.3.5 Solution Analysis
1.3.6 Product Specification and Design
1.3.7 Product Implementation
2. Configuration Management Plan & Procedures

3. Development Support Environment

4. Verification & Validation

4.1 Testing Plan Introduction
4.1.1 Testing Plan Purpose
4.2 Testing Plan Scope
4.3 Lifecycle Phase Independent Activities
4.3.1 Critical Analysis and Risk Assessment
4.3.2 Document Review
4.4 Lifecycle Phase Dependent Activities
4.4.1 Analysis Evaluation
4.4.2 Design Evaluation
4.4.3 Software Development Evaluation
4.4.4 Test Evaluation
5. Testing Plan

6. Quality Assurance Process Plan

7. Documentation Plan

7.1 Document Completion Procedure
7.2 Transfer Procedure
7.3 Feedback
8. Delivery Plan
8.1 Project Deliverables
9. Distribution Procedure
9.1 Deliverable Distribution Procedure
10. Marketing Plan
10.1 Product Advertisement



  

1. System Development Plan

1.1 Major Milestones

In regard to application development, the major milestones are broadly defined. The following table lists the milestones in relative chronological order along with their dependent tasks, the number of people working on them, and the date they are due. Some milestones are listed as ongoing because they span a number of processes. Milestones marked "tba" have due dates yet to be announced.
 

Milestone                         Milestone Description  Dependency  Resources  Due Date
Requirements Analysis                                                           29 April 1999
Verification                                                                    ongoing
Validation                                                                      ongoing
Requirements Solution Statement                                                 3 May 1999
Requirements Refinement                                                         ongoing
Solution Analysis                                                               tba
Product Specification & Design                                                  1 October 1999
Product Implementation                                                          tba
Product Testing                                                    4 - 6        1 October 1999

Table 1.1 - 1: Major Milestones
 

1.2 Gantt Chart

The schedule as outlined below only defines the first five tasks we are to complete, simply because we don't yet have enough information about the later tasks at this early stage. As more information comes to hand, the schedule will be updated accordingly.

Note that the verification and validation tasks here only indicate the processes in regard to the functional prototype, not the final system. This is why the milestones table above lists them as ongoing, while here we state their start and end dates.

Gantt Chart
Figure 1.2 - 1: Gantt Chart

In the above chart, the yellow bars indicate the planned start, duration and end dates while the dashed bar represents the "slack" time in which we can finish the task without adversely affecting the time specified for later tasks.
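The slack calculation described above can be sketched in code. The dates below are invented for illustration and are not taken from the actual schedule:

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// Illustrative only: "slack" is the gap between a task's planned finish
// and the latest finish that still leaves later tasks unaffected.
public class SlackDemo {
    static long slackDays(LocalDate plannedFinish, LocalDate latestFinish) {
        return ChronoUnit.DAYS.between(plannedFinish, latestFinish);
    }

    public static void main(String[] args) {
        // Hypothetical dates: a task planned to finish 23 April, with the
        // following task not due to start until 3 May.
        long slack = slackDays(LocalDate.of(1999, 4, 23), LocalDate.of(1999, 5, 3));
        System.out.println("Slack: " + slack + " days"); // Slack: 10 days
    }
}
```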
 

1.3 Procedures
 

1.3.1 Requirements Analysis

In order to build the correct system we need to analyse the given requirements in detail. The requirements statement given to us provides only a broad idea of what functionality the system needs to provide. From this we need to flesh out what is needed and what is not.

In order to do this effectively we have chosen to implement a functional prototype. This will enable us to show the client our exact interpretation of the requirements. The client can then say, "No that's not what I meant," "Yes that's it" or better yet, "Yes but I want it to look like this."

During the process of building the prototype we will not only be determining what our interpretations of the requirements are, but also learning about the technologies required to implement them and more concretely establishing our development and quality assurance processes.
 

1.3.2 Verification

Verification is basically the process of determining whether we are building the right product. Once the prototype is complete we will spend about two weeks in consultation with our client performing requirements verification. This will involve an iterative two step process:
 

1   Determine what's missing or not required

2   If updates are still required

2.1   Update the prototype
2.2   Return to step 1

3   Otherwise stop verification


It should be noted that verification is an ongoing process. At the completion of each milestone, we will need to perform a verification analysis to make sure we are still on track. As with the process above, this will be done in conjunction with the client.

Following the completion of this verification phase, we will be ready to begin defining the system we intend to build more precisely. This will be in the form of the Requirements Solution Statement.
 

1.3.3 Requirements Document Development

The requirements solution statement is an abstract definition of the functional and non-functional requirements of the system as we have interpreted them. It will be targeted at both the application engineers and the client and as such will be in a form readable and understandable by both. It will serve as the contract between ourselves and the client by defining exactly what we have to provide. The requirements document also serves as a basis for QA to develop testing procedures for the development/testing phase of the project.

As stated above, verification will once again take place. We will present the solution statement to the client and perform a requirements review. This will involve working through each functional and non-functional requirement, explaining the implications of each to the client. Any omissions, errors, conflicts or contradictions need to be formally recorded and resolved.

1.3.4 Validation

Validation is the process of making sure we are building the product correctly. That is, making sure it is not faulty or "buggy." Throughout development, weekly builds of the software will be declared and handed to the Quality Assurance (QA) team.

QA will then perform various testing procedures on the product and present the defects to the application engineers to fix. Once all the defects found by QA have been fixed, another build is declared and the process repeats itself.

The actual testing process is defined more precisely in the testing section later in this document.
 

1.3.5 Solution Analysis

Once we have our final solution statement, we need to design the final product. Given that we have already built the functional prototype by this stage, we should have a good idea as to how to go about building the system.

We will be able to look at the design of the prototype and determine:

  • what worked well
  • what needs to be redesigned
  • what needs to be thrown away altogether
  • what needs to be added.
This will form the basis of the Product Specification and Design.
 

1.3.6 Product Specification and Design

This will be based on the solution statement and our solution analysis. It will include implementation-specific information such as class diagrams, object models and so forth. At this stage the specification and design process is relatively undefined; as the project develops, more information will become available.
 

1.3.7 Product Implementation

As above, the process of implementation will become clearer as it comes closer and will be defined at a more appropriate time.
 
 
 

2. Configuration Management Plan & Procedures

From the point where there is a functional user interface skeleton, builds will be made at intervals set by the Application Engineer.

Each build will be a tagged version, named sequentially build-1, build-2, etc. These builds are sent to QA and, if they are deemed suitable for release, they are released as internal versions. The criteria QA uses to determine what qualifies as a version are specified in this document under the QA and Test Plan sections [See Section 5].

When versions are to be made, they will be named sequentially version-1, version-2, etc. These builds and versions will be performed by the Configuration Manager. Any problems should be reported to the Application Engineer (author) and the Configuration Manager.

Instructions on checking out builds and versions will be published when the first build is created. The build and version numbers are for internal use only and do not relate to public releases. Public releases will be made at the discretion of the Application Engineer, subject to quality assurance approving a build, and will be given names such as 'Cohesion' and 'InSync'.

Projected release dates will be published by the Application Engineer when they become available.
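The sequential build/version naming convention above can be illustrated with a small helper. This sketch is purely illustrative and is not part of any Team Synergy tool:

```java
// Illustrative sketch of the sequential tag naming convention described
// above (build-1, build-2, ... and version-1, version-2, ...).
public class TagNames {
    // Given the most recent tag (e.g. "build-3"), produce the next one.
    static String nextTag(String lastTag) {
        int dash = lastTag.lastIndexOf('-');
        String prefix = lastTag.substring(0, dash + 1);          // e.g. "build-"
        int number = Integer.parseInt(lastTag.substring(dash + 1));
        return prefix + (number + 1);
    }

    public static void main(String[] args) {
        System.out.println(nextTag("build-1"));    // build-2
        System.out.println(nextTag("version-4"));  // version-5
    }
}
```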
 
 
 

3. Development Support Environment

The software development environment for any Team Synergy product is a suite of applications which allow quality software to be written in a flexible way. All of the development software is available in the Synergy software development laboratory. The following software is supported:

  • Windows 95/98/NT 4.0
  • Java 1.2 (Java 2) JDK from Sun Microsystems, Inc.
  • JRE 1.2 from Sun Microsystems, Inc.
  • Notepad
  • Symantec Visual Cafe 3.0
  • Vim for Win32
  • JBuilder 2.0
  • CVS 1.10 for Win32
  • PrettyTyper 1.0 from Team Synergy
Most of this software is either bundled with the Operating System, or available freely under the GNU GPL or other licensing agreements.

Developers are free to use other development tools. Before any code is checked in to the repository, it must compile using the supported tools. Furthermore, all code being checked into the repository should be passed through the PrettyTyper program to ensure that the Team Synergy-defined code style guide is adhered to. Developers may not receive support for problems caused by using unsupported tools.

As new tools become available and are tested, they may be added to the list of supported software.

The software listed above requires the following minimum system hardware specification to run:

  • Intel PentiumPro 200 MHz or higher
  • Windows 95 or Windows NT 4.0
  • Memory 64MB
  • At least 100 MB of free hard disk space
  • CD-ROM drive
  • SVGA or higher resolution monitor (800x600)
  • Mouse or other pointing device
  • Internet connectivity
Any system not meeting these requirements will not be able to install and/or run the required software.

Development under other operating environments and/or operating systems is not supported. It may be possible to develop and build under the Solaris OS, or on private Linux installations, but this is not supported and any problems must be solved by the individual developers. All code developed under these environments must be tested under the supported environment before being checked in to the repository.
 

4. Verification & Validation

 
Note: This document has been adapted from the "Independent Release Verification and Validation Plan", prepared by INTERMETRICS: 6301 Ivy Lane Suite #200, Greenbelt, MD 20770 (USA), to suit the Team Synergy Project Verification and Validation Plan.


4.1 Testing Plan Introduction

4.1.1 Testing Plan Purpose

The purpose of this Verification and Validation Plan for the Team Synergy Product is to document the planned steps taken by Synergy to ensure that the product developed is of high quality in relation to what the customer has requested. In other words, it assures that we take steps so that:

  • We build the right product.
  • We build the product right.


4.2 Testing Plan Scope

This Verification and Validation Plan addresses the focus of Team Synergy development analysis activities to be performed during the period 15 March 1999 to 30 October 1999. Specific analysis activities are as follows:

  • Analyse software code and software development documentation (eg. software development plans, project instructions, configuration management plans). This is to assess whether the implementation is traceable to the design and of high quality (ie components comply with standards, are internally consistent, do not implement unintended functionality, support desired user interaction, and do not adversely impact the expandability of the system, etc.).
  • Witness the Team Synergy Product testing and assess the traceability and testability of the test results. Test witnessing activities will be coordinated with the Customer and Team Synergy Quality Assurance (QA) representatives.


4.3 Lifecycle Phase Independent Activities

Lifecycle Phase Independent Activities for the Team Synergy Product are those whose execution is independent of the particular lifecycle phase in which they are executed. This section addresses those activities and ad hoc document review support.

The Concurrent Versioning System will be the place to obtain code for development analysis activities. The designated Team Synergy Quality Assurance representative(s) will be responsible for coordinating test witnessing activities. The QA Team Leader will be responsible for test status reporting. Information on individual test execution and related technical questions will be available from QA Team members as needed.

The Team Synergy development analysis and test witnessing team will utilise Team Synergy designated office space at Swinburne University of Technology, Hawthorn Campus to facilitate communication and access to information.

4.3.1 Critical Analysis and Risk Assessment

One of the initial steps in planning and allocating Synergy resources to a release effort is to perform a Critical Analysis and Risk Assessment. The outcome of the study allows the Team Synergy team to assign priorities to the various release components and assures that the most critical areas receive adequate coverage.

4.3.2 Document Review

Software documentation reviews are conducted to observe measurable progress in the software completion process by reviewing and analysing delivered software design and development documentation. There are three kinds of document reviews conducted by the project team, namely:

  • Document revisions received via the Customer as a Configuration Change Request (CCR). Synergy responses are submitted in the form of Impact Analysis Reports. An example may be a CCR proposing changes to Level 3 requirements that could present a potential impact to the detailed design.
  • Checking for missing documents. This occurs when extra requirements or extra wanted functionality is received from the Customer.
  • Checking for redundant documents or document sections. This occurs when requirements or functionality wanted is removed by the Customer.


4.4 Lifecycle Phase Dependent Activities

Lifecycle Phase Dependent Activities are those performed during specific phases of the Team Synergy Product development lifecycle. The Team Synergy project team will review the following steps in the Team Synergy Product life cycle to ensure that the product is being built to fit the Customer's requests:

  • Analysis Evaluation (Prototyping)
  • Design Evaluation
  • Software Development Evaluation
  • Test Evaluation
4.4.1 Analysis Evaluation

Analysis evaluation consists of examining both the process in which Team Synergy performed the analysis for the Team Synergy Product and the actual requirements generated by the effort. Then, these requirements are compared to the known requirements received from the Customer, and the functionality required by the Customer.

4.4.2 Design Evaluation

Design Evaluation consists of examining both the process in which Team Synergy produced the design for the Team Synergy Product and the actual products generated by the effort. Follow-on design analysis will focus on reviewing the progress of software development processes and changes, and enhancements to design products.

4.4.3 Software Development Evaluation

Software Development Evaluation consists of the Team Synergy Team analysing software code and related documents to assess whether the implementation is traceable to the design and of high quality. The software will also be checked for standards compliance, internal code consistency, appropriate functionality, and support of desired user interaction, as appropriate. The Team Synergy project team will employ the following process in performing software development analysis:

  • Identify sub-system components
  • Obtain code snapshots of identified software components during the code walk-through period.
The Team Synergy Product Software Development Evaluation is primarily product oriented (ie focus is examination of software code); however, the implementation of software development processes (eg. adherence to development standards) will be examined also.

4.4.4 Test Evaluation

Test Evaluation consists of the Team Synergy project team witnessing and independently analysing results of system tests performed by the QA Team. The Team Synergy project team will employ the following process in performing test evaluation:

  • Obtain system test and verification plans, and test schedules.
  • Review plans for sufficiency and completeness of test coverage and requirements traceability.
  • Identify tests to witness, focusing on critical requirements.
  • Witness tests.
  • Analyse test results provided by the QA Team to assess test conduct compliance to plans and procedures, and to verify that the results accurately and completely reflect the outcome of the tests.
The Team Synergy Product Test Process Evaluation focuses on how the QA Team test process is implemented. The evaluation examines the test plans for the system as well as the verification plan, and the system integration and test plans to assess the likelihood that the process will (continue to) yield the required implementation end-products.
 
 
 

5. Testing Plan

5.1 Test Cycle

The following is an outline of the planned testing strategy for all Team Synergy software development projects. It is important to note that throughout each cycle, code inspections and reviews will take place when time permits.

Any parts of code not testable through the black box / interface testing methods will be exercised by test harnesses written by QA team members and approved by the Project Manager, Development and QA team leaders.

For Team Synergy QA and non-QA internal procedures and issues relating to testing, please refer to the Quality Assurance Plan document available on the Team Synergy web site.

5.1.1 Planned Test Cycles

  • Unit Testing
  • Cluster Testing
  • Thread Testing
  • Stress/Performance Testing
  • Acceptance Testing
5.1.2 Unit Testing Cycle

This will involve the white box and black box methods of testing using test harnesses to locate defects within each class of the system. The outputs expected from the inputs entered will be determined from the requirements specification document.
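A minimal sketch of what such a test harness might look like follows. The `Account` class and its behaviour are invented for illustration; in practice the expected outputs come from the requirements specification document:

```java
// Minimal sketch of a unit-test harness for a single class.
// "Account" is a hypothetical unit under test, not a real Team Synergy class.
public class UnitHarness {
    static class Account {
        private int balance = 0;
        void deposit(int amount) {
            if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
            balance += amount;
        }
        int balance() { return balance; }
    }

    static int passed = 0, failed = 0;

    // Record each test as PASS or FAIL, as the testing plan requires.
    static void check(String name, boolean condition) {
        if (condition) { passed++; System.out.println("PASS: " + name); }
        else           { failed++; System.out.println("FAIL: " + name); }
    }

    public static void main(String[] args) {
        Account a = new Account();
        a.deposit(50);
        check("deposit adds to balance", a.balance() == 50);

        boolean rejected = false;
        try { a.deposit(-10); } catch (IllegalArgumentException e) { rejected = true; }
        check("negative deposit rejected", rejected);

        System.out.println(passed + " passed, " + failed + " failed");
    }
}
```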

5.1.3 Cluster Testing Cycle

The main focus will be interface testing of the parameter and procedural interfaces, though minimal black box and white box testing will also occur at this stage. The outputs expected from the inputs will be determined from the requirements specification document.

5.1.4 Thread Testing Cycle

Interface testing will take up the majority of work hours with a strong focus on the testing of the synchronised classes and message passing interfaces (anticipated). Thread testing will be repeated on the entire integrated system once all sub-systems are free of interface errors.
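As a sketch of the kind of synchronised class these tests target, consider the following. The `Counter` class is invented for illustration; without the `synchronized` keyword, concurrent increments could be lost:

```java
// Illustrative example of a synchronised class of the kind thread testing
// exercises; "Counter" is a hypothetical class, not a real Team Synergy one.
public class ThreadDemo {
    static class Counter {
        private int value = 0;
        synchronized void increment() { value++; }
        synchronized int value() { return value; }
    }

    public static void main(String[] args) throws InterruptedException {
        final Counter counter = new Counter();
        Runnable work = () -> { for (int i = 0; i < 10000; i++) counter.increment(); };
        Thread a = new Thread(work), b = new Thread(work);
        a.start(); b.start();
        a.join(); b.join();
        // Without "synchronized", lost updates could make this less than 20000.
        System.out.println(counter.value()); // 20000
    }
}
```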

5.1.5 Stress/Performance Testing Cycle

The idea of this phase is to see how quickly and reliably the system operates under increasing loads and users. The aim is to identify possible problem areas that can be fixed in order to fine tune the system.

5.1.6 Acceptance Testing Cycle

This will be the final phase of testing. It will only take place after all the previous test phases have been completed and the system is error free within the development environment.

The purpose of this test phase is to test the performance of the system in its real-world working environment with real users. This is done in order to find errors not anticipated by the developers or QA staff.
 

5.2 Code Inspections

During the regular code inspections and reviews, particular attention will be paid to:

  • The use of well defined exception handlers, instead of using the default.
  • Use of product on all platforms.
  • Use of comments.
  • Maintainability of code.
  • "Magic Numbers" included for no apparent reason.
  • Loops with multiple exit or entry points.
  • Unreachable code.
  • Memory leaks.
  • Variables used without initialisation.
  • Variables declared but never used.
  • Possible array bound violations.
  • Uncalled functions.
  • Use of Public, Private, Protected and Published sections within classes.
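Two of the inspection points above (magic numbers and well-defined exception handlers) can be illustrated with a small sketch. The names and values here are invented for illustration:

```java
// Illustrative sketch of two inspection points from the checklist above.
public class InspectionDemo {
    // Named constant: an inspector would flag a bare "86400" in the body
    // below as a magic number included for no apparent reason.
    static final int SECONDS_PER_DAY = 86400;

    static int daysToSeconds(int days) {
        return days * SECONDS_PER_DAY;
    }

    // Well-defined exception handler: catch the specific exception we can
    // deal with, rather than a blanket default catch (Exception e) {}.
    static int parsePortOrDefault(String text, int fallback) {
        try {
            return Integer.parseInt(text);
        } catch (NumberFormatException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        System.out.println(daysToSeconds(2));               // 172800
        System.out.println(parsePortOrDefault("8080", 80)); // 8080
        System.out.println(parsePortOrDefault("oops", 80)); // 80
    }
}
```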


5.3 Test Plans/Scripts

For each class, cluster of classes, sub-system and the entire system, a test plan/script document will be written with the following sections and content:

  • Test Conditions

    A brief explanation of the conditions under which the software tests will be conducted, covering:

    • Pre-Test Background
      Brief explanation of what is expected to have occurred before the testing of the software begins. For example, "It is assumed that the software has been put through ad-hoc testing performed by the developer, and can run without any obvious fatal errors".
    • Test Site
      Where the tests are to be performed, ie "The tests are being conducted in the room EN211B Laboratory, Swinburne University of Technology, Hawthorn Campus".
    • Test Personnel
      The number and names of the persons involved in the testing of the software in question.
    • Test Environment
      The hardware and software specifications on which the tests will be conducted.
    • Test Constraints
      Anything within the test environment that may inhibit the quality or productivity of the tests.
    • Test Methodologies
      What style of testing will be used, ie. Glass-Box (White-Box) testing.
    • Test Tools
      Any software tools that may be used to help test the software, ie. NuMega Bounds Checker.
    • Test Completion Criteria
      The criteria needed for the software to pass the testing process.


5.4 Test Overview

A brief introduction that explains the content of the following three sub-sections, which are:

  • Functional aspects that need to be tested
  • Non-Functional aspects that need to be tested
  • Test sections and example tests
5.4.1 Breakdown of Functional Aspects

A description/list of every aspect of the functional points of the software to be tested.

5.4.2 Breakdown of Non-Functional Aspects

A description/list of every aspect of the non-functional points of the software to be tested.

5.4.3 Test Sections and Example Tests

Description of the formulated test sections and sample tests that will be conducted within those test sections in table format.
 

Functional Tests 
Test Section  Example Tests Involved 
   
Non-Functional Tests 
Test Section  Example Tests Involved 
   

5.5 Functional Tests

5.5.1 Test - Section Name (e.g. Exception Handling)

A brief description of the purpose of the test section.

Functional Test Inputs

An outline of the types of inputs entered by the tester or by the test harness.

Functional Test Outputs

An outline of the outputs expected from the software for the types of inputs entered. Also include any recording devices which will monitor the test process. If this includes files, list the names of the files that record information.

Procedures

Numbered, detailed tests, their steps and expected results. The number of tests depends on the size of the software to be tested. To each numbered test in this section, append the following:
 

Pass:    Fail:    Refer to:     

5.6 Non-Functional Tests

5.6.1 Review (section of software to review)

A brief description of the purpose of the review section.

Procedures

A numbered list of things to review and report personal opinions on, e.g. the colour of the background in the client logon dialog box.
 
 
 

6. Quality Assurance Process Plan

This document describes the following:

  • The structure the Synergy QA team will work in.
  • The procedures and protocol the Synergy QA team will follow.
  • Assumptions and expected standards from other Synergy members.
  • Actions taken for possible issues or scenarios that may arise.


6.1 QA Team
6.1.1 Team Structure
 

Name  Skills  Type of work assigned 
Matthew Sykes (TL)  QA Management and Process C++/C, Visual Design 
  • QA Management. 
  • QA Process document composition. 
  • Document reviews. 
  • Test plan composition. 
  • Testing implementation. 
  • GUI reviews. 
George Panagiotopoulos  Beta Testing, C/C++ 
  • Test plan composition. 
  • Document reviews. 
  • Testing implementation. 
  • GUI reviews. 
Simon Hutchison  Java 
  • Test plan composition. 
  • Writing of test harnesses. 
  • Code inspections. 
Jared Clinton  Web-Database Development, Data warehousing and Reporting 
  • Test plan composition?
  • Testing implementation? 
  • Internet Site reviews and testing?
  • Writing of test harnesses?
  • Code inspections?
Chung Siu Chu Java, C++ and HTML
  • Test plan composition?
  • Testing implementation? 
  • Internet Site reviews and testing?
  • Writing of test harnesses?
  • Code inspections?
Table 6.1.1 - 1: QA Team and Responsibilities

6.1.2 Team Meetings

Team Synergy QA will meet weekly at 4.30pm on Monday to discuss ongoing work and any problems that arise. These meetings are compulsory for all QA team members.
 

6.2 Documentation

Formal deliverable documents written by Synergy members will be reviewed by a QA team member for content correctness and completeness according to:

  • The user requirements.
  • The analysis and design outcome.
  • An expert peer reviewer’s opinion (may be a member outside the QA team, if the expert knowledge is lacking within the QA team regarding the document subject).
Non-content related issues, such as the spelling, grammar and style of document, are to be checked by the Publishing Group.
 

6.3 Internet Site

The Synergy Internet Site will be reviewed by one or more QA team members for content correctness and completeness according to:

  • The user requirements.
  • HTML Style Guide.
  • An expert peer reviewer’s opinion (may be a member outside the QA team, if the expert knowledge is lacking within the QA team regarding the document subject).
Non-content related issues, such as the spelling, grammar and style of website, are to be checked by the Publishing Group.

For the functional aspects of the Synergy Internet Site, software testing procedures will be followed according to the procedures and issues detailed in Section 6.5.
 

6.4 QA Release Software

The following section outlines the procedure for and the standards expected for the release of software to be tested.

6.4.1 Procedure for Releasing Software to QA

  • Check that the software written complies with the standards defined in the following section [See Section 6.4.2].
  • Any special requirements when running or compiling the software will be given in an e-mailed QA team memo written by the developer, giving instructions on how to fulfil these requirements.
  • Tag the build as build-<build number>.
  • E-mail the QA Team Leader regarding the availability of the software to be tested.
6.4.2 Standard of QA Release Code

Any code that is released to QA must adhere to the following standards:

  • Must be able to compile.
  • It must conform to the Synergy Technical Standards Document.
  • All code will be well commented.
  • No magic numbers in code without a very good reason.
  • The code must have passed quick ad-hoc testing performed by the developer.
  • Any temporary imported files in the code will be removed or commented out.
Any software passed to QA that does not comply with the standards defined in this section will be handed straight back to the author.


6.5 Testing Process

This section describes the procedures QA will follow in order to assure a complete and unbiased testing of the Synergy software product.

6.5.1 Procedure for Testing

For each section of the system to be tested, the following steps will occur:

  • A test plan will be written conforming to the software requirements specification document. Each test plan document will be written using the test plan document template in the /teamb/docs CVS directory named "QATestPlan.dot". These documents will be written in Microsoft Word for Windows format, using the 97 (8.0) Version.
  • The test plan document will be reviewed by a peer.
  • The testing start date will be written on the test plan document.
  • The compiled version of the software is retrieved from the /bin/qa CVS directory.
  • The testing environment is setup according to the test plan document specifications.
  • Each test step is run, recording the inputs and results where possible.
  • A "Pass" or "Fail" is given to each test.
  • When the test plan has been completed, the "end" date is written on the test plan document.
  • All bugs found are logged, adhering to the bug logging procedure defined in Section 7 of this document.
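The record keeping in the steps above can be sketched as a small data structure. This is an assumption-laden illustration: the real records live in the test plan document ("QATestPlan.dot"), and the class and field names here are invented for the sketch.

```python
# Minimal sketch of the test-run record keeping described above: start
# date written at the beginning, each step recorded with its inputs and
# result and a Pass/Fail verdict, and the end date written on completion.
from datetime import date

class TestRun:
    def __init__(self, plan_name: str):
        self.plan_name = plan_name
        self.start_date = date.today()   # written on the test plan document
        self.end_date = None
        self.steps = []                  # (description, inputs, result, verdict)

    def record_step(self, description, inputs, result, passed: bool):
        """Record one test step, its inputs and result, and a Pass/Fail."""
        self.steps.append((description, inputs, result,
                           "Pass" if passed else "Fail"))

    def complete(self):
        """Write the 'end' date once every step in the plan has been run."""
        self.end_date = date.today()

    def failures(self):
        """Steps that must be logged as bugs per the bug logging procedure."""
        return [s for s in self.steps if s[3] == "Fail"]
```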
6.5.2 Issues Relating to the Testing Procedure

The following are issues relating to the process of the Synergy QA Team:

  • All test harnesses will be written by QA staff, and tested by QA staff other than the author of the software.
  • Before the test harnesses can be used for testing, they have to be authorised by the Application Engineer Team Leader.
  • Test plan documents must define if and how any testing tools are to be used for the testing of the software.
  • Testers who have had any involvement in the development of the software to be tested will not be used to test that software, due to possible conflicts of interest.
  • Multiple testers testing the same software simultaneously MUST have identical software versions.
  • All testing environments must be noted in the test plan, and closely adhered to.
  • Multiple testers testing the same software simultaneously MUST have identical testing environments.
  • Testers will test all software strictly following the test plan documents. Any further tests required that are not outlined in the test plan will be appended to the test plan document (with the appropriate document revisions) before those tests are commenced.
  • Where possible, all tests must be recorded into a log file (using the log class). If no automatic recording of the test is possible, use a pen and paper to record the behaviour and results found.
  • When manually recording, the tester must also note the date, time, files used and their version.
  • Any bug found must be reproducible several times before it is logged.
  • Aside from coding errors, bugs can be logged for:
    • The look or feel of a graphical user interface.
    • Missing user requirements.
    • Missing requirements determined from the analysis and design.
    • Misinterpretation of requirements.
    • Lack of on-line help.
  • Any bug logged concerning the tester’s opinion of the look or feel of the software must explain the reasons for that opinion, giving examples of the problems encountered where possible.


6.6 Bug Notification

The KeyStone tracking system will be used for logging, viewing and editing bugs. The information on each bug logged will be as follows:

  • Date and time found.
  • The component name (e.g. the Client class).
  • The software version.
  • Concise description of the problem.
  • Type of bug (e.g. Interface).
  • Severity of bug, on a scale of 1 - 5 [1: fatal error; 3: non-fatal functional error; 5: cosmetic error (e.g. no on-line help content)].
  • Long description of the test step, behaviour and result.
  • Current state: Not fixed, Open (i.e. being fixed), Fixed.
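The bug record fields above can be sketched as a data structure. The field names are illustrative only; KeyStone's actual schema is not reproduced here.

```python
# A sketch of the bug record listed above. The severity scale and state
# names come from this section; everything else is an assumption.
from dataclasses import dataclass, field
from datetime import datetime

STATES = ("Not fixed", "Open", "Fixed")

@dataclass
class BugReport:
    component: str          # e.g. the Client class
    version: str            # software version under test
    summary: str            # concise description of the problem
    bug_type: str           # e.g. "Interface"
    severity: int           # 1 (fatal) .. 5 (cosmetic)
    detail: str             # long description of test step, behaviour, result
    state: str = "Not fixed"
    found: datetime = field(default_factory=datetime.now)  # date and time found

    def __post_init__(self):
        if not 1 <= self.severity <= 5:
            raise ValueError("severity must be on the 1-5 scale")
        if self.state not in STATES:
            raise ValueError(f"state must be one of {STATES}")
```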


6.7 Test Cycle Completion and Iteration

Each test cycle will be iterative. To move on to the next test cycle, there must be either no outstanding bugs or only bugs of severity 5.

There must be no bugs logged in the Internet Site bug tracking system before any stress or performance testing can be commenced.
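The two completion rules above can be expressed directly, assuming bug severities are recorded on the 1 - 5 scale from Section 6.6. The function names are illustrative.

```python
# The test cycle completion rules above: a cycle may end when only
# severity-5 (cosmetic) bugs remain, and stress/performance testing
# requires an empty bug list.

def can_start_next_cycle(open_severities: list) -> bool:
    """True if no bugs remain, or only severity-5 (cosmetic) bugs remain."""
    return all(s == 5 for s in open_severities)

def can_start_stress_testing(open_severities: list) -> bool:
    """True only when no bugs at all are logged in the tracking system."""
    return not open_severities
```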
 

6.8 Code Inspections

All code inspections will be performed according to the Software Inspection Process document obtainable from the QA Team Leader.
 
 

7. Documentation Plan
 

7.1 Document Completion Procedure

The various steps to be followed for a given document to be completed are as follows:

  • The content of a document should be supplied to the Publishing Group by the group that is responsible for the document. This content can come in any form, from scraps of paper, to word-processed documents.
  • A draft is written of the document and is checked and verified for compliance with the Document Style Standards by the Team Synergy Technical Writers. At this stage the document should also be given a Version number and a document ID.
    Note: The draft should use the generic document template located here. This template contains the proper formatting and style according to the Team Synergy Document Style Standards.

  • The document is then passed from the Publishing Group to the Quality Assurance Team [See Section 7.2].
  • The Quality Assurance Team checks and verifies the documents for compliance with their standards.
  • The document is returned to either the Publishing Group (for grammar/readability issues) or the group that initiated the document (for content related issues). Feedback on corrections or improvements to be made before the final draft should be supplied. [See Section 7.3].
  • The hand-over, checking and feedback steps above are repeated until no further corrections/improvements are required for the deliverable.


7.2 Transfer Procedure

To ensure that a document has been transferred from one Synergy sub-group to another, and that each group is aware of the transfer, the following steps must be adhered to:

  • A copy of the document, in HTML format, is placed in the 'docs' module of the Team Synergy CVS.
  • An email is sent to the entire Publishing Group (to inform them of the completion and handover of the document) and to the entire Quality Assurance Team (whose members will be delegated the task of checking the document by the Quality Assurance Head). The email will contain the following information:
    • Document ID
    • Author
    • Filename
    • Version number
    • Corrections/improvements made (only relevant if Quality Assurance has already seen the document and had sent it back for corrections)
    • Any necessary comments.
  • Finally, the milestone associated with the document should be transferred between the appropriate groups in the Milestone Tracker system.
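The contents of the handover email above can be sketched as a small formatter. The layout is an assumption; the fields are those required by the procedure.

```python
# Sketch of the handover e-mail body from the transfer procedure above.
# The corrections list is only relevant when Quality Assurance has
# already seen the document and sent it back.

def handover_email(doc_id, author, filename, version,
                   corrections=None, comments=""):
    """Build the e-mail body announcing a document handover."""
    lines = [
        f"Document ID: {doc_id}",
        f"Author: {author}",
        f"Filename: {filename}",
        f"Version: {version}",
    ]
    if corrections:  # only on a resubmission after QA feedback
        lines.append("Corrections/improvements made:")
        lines += [f"  - {c}" for c in corrections]
    if comments:
        lines.append(f"Comments: {comments}")
    return "\n".join(lines)
```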


7.3 Feedback

Once the Quality Assurance Team has checked and verified the document for compliance with their standards, one of two steps will be taken:

  • If there are corrections or improvements to be made, an email is sent to the Quality Assurance Head (if someone other than him checked the document) and to the entire Publishing Group (to inform them of the status of the document). The email will contain the following information:
    • Document ID
    • Filename
    • Date submitted
    • Version number
    • Reviewer
    • Problems discovered within the document. Problems to be reported include:
      • Incorrect content
      • Incomplete content
      • Redundant content
    • The section and line number (counted from the start of that section) locating each error.
    OR
  • If the document has passed all Quality Assurance procedures, an email is sent to the Chief Quality Assurance Officer (if someone other than him checked the document), the entire Publishing Team (to tell them to publish the document and to inform them of its completion) and the Team Leader (to inform him of the document's completion). This email will contain the following information:
    • Document ID
    • Filename
    • Date Submitted
    • Version number
    • Reviewer
    • Document ready for release
In either case, the milestone associated with the document should be transferred between the appropriate groups to reflect the document's change in status.
 
 

8. Delivery Plan
 

8.1 Project Deliverables

For a deliverable to be ready for submission, it must have first followed the procedures outlined in the Documentation Plan [See Section 7]. This process verifies that the deliverable has met the Team Synergy Style and Quality Assurance Standards. Finally, the deliverable is published on the web site before 9.00pm on Friday of the due week and an email is sent to the assessor.
 
 

Deliverable                      Due Date                   Mark
Planning                         Ongoing                    10
Problem Analysis/Solution        TBA                        10
Manuals and Product Evaluation   Week 12, Semester 1 & 2    10
Support/Utility Software         Semester 1 (ongoing)       10
Product - Alpha                  Week 6, Semester 2         10
Product - Beta                   Week 8, Semester 2         10
Product - Final                  Week 11, Semester 2        10
Product Design Documentation     Weeks 1-11, Semester 2     10
Web Site                         Ongoing                    10
Process/Management               Ongoing                    10

Table 8.1 - 1: Major Deliverables
 
 

9. Distribution Procedure
 

9.1 Deliverable Distribution Procedure

When a given document is ready to be released, the following steps are to be taken:

  • Verify that the documentation procedure has been followed [See Section 7].
  • The document is then published on the Team Synergy Web Site.
  • At this stage the Quality Assurance Team should verify that the web site is functional, i.e. no broken links or missing images.
  • The Team Leader is to be notified, as it is his/her responsibility to signal that the deliverable is "completed".
  • The Team Leader should send an email to everyone in Team Synergy (to notify them of the document's release) and to the convenor, Rajesh Vasa (to inform him that the document deliverable has been submitted online).
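The "no broken links or missing images" check above could be partly automated. The sketch below is an assumption, not a tool the team uses: it scans a local HTML file with the Python standard library and reports relative link and image targets that do not exist on disk (external links and page anchors are skipped).

```python
# Minimal broken-link check for the web site verification step above,
# using only the standard library.
from html.parser import HTMLParser
from pathlib import Path

class RefCollector(HTMLParser):
    """Collects <a href> and <img src> targets from an HTML document."""
    def __init__(self):
        super().__init__()
        self.refs = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.refs.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.refs.append(attrs["src"])

def broken_refs(html_file: Path) -> list:
    """Relative references in html_file whose targets are missing on disk."""
    parser = RefCollector()
    parser.feed(html_file.read_text())
    missing = []
    for ref in parser.refs:
        if "://" in ref or ref.startswith(("mailto:", "#")):
            continue  # external links and in-page anchors are not checked
        if not (html_file.parent / ref.split("#")[0]).exists():
            missing.append(ref)
    return missing
```

Running this over every page of the site before notifying the Team Leader would catch missing images and dead relative links mechanically.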

 

10. Marketing Plan
 

10.1 Product Advertisement

Once all the relevant procedures have been followed to produce a deliverable, all that remains is for Team Synergy to promote the product. To do this the following steps will be followed:

  • A statement will be issued on the Team Synergy web site in the 'What's New' section announcing the new release.
  • The Team Synergy web site will be used to advertise the product's features and enhancements.
  • A marketing presentation will be given to inform attendees of the product's features and enhancements.
    This will include a demonstration of the product's uses and its ability to complete its tasks efficiently and accurately.
In the months before the final product is released, the primary focus of the web site will shift from project management and development support to marketing, with the former role taken up by a sub-area of the marketing site. A marketing team will be set up, drawing primarily on the resources of the Publishing Group, but also on other members of the team who possess relevant skills.

 
