In regard to application development, the major milestones are broadly defined. The following table lists the milestones in relative chronological order, along with their dependent tasks, the number of people working on them, and their due dates. Some milestones are listed as ongoing because they span a number of processes. The last three milestones have no due dates, as these are to be announced.
Table 1.1 - 1: Major Milestones

The schedule outlined below defines only the first five tasks we are to complete, simply because we do not know enough about the later tasks at this early stage. As more information comes to hand, the schedule will be updated accordingly. Note that the verification and validation tasks here cover only the processes relating to the functional prototype, not the final system; this is why the milestones chart above lists them as ongoing, while here we state their start and end dates.

In the above chart, the yellow bars indicate the planned start, duration and end dates, while the dashed bar represents the "slack" time within which we can finish a task without adversely affecting the time allocated to later tasks.

In order to build the correct system we need to analyse the given requirements in detail. The requirements statement given to us provides only a broad idea of the functionality the system needs to provide. From this we need to flesh out what is needed and what is not. To do this effectively we have chosen to implement a functional prototype. This will enable us to show the client our exact interpretation of the requirements. The client can then say, "No, that's not what I meant," "Yes, that's it," or better yet, "Yes, but I want it to look like this." During the process of
building the prototype we will not only be determining what our interpretations
of the requirements are, but also learning about the technologies required to
implement them and more concretely establishing our development and quality
assurance processes.

Verification is the process of determining whether we are building the right product. Once the prototype is complete we will spend about two weeks in consultation with our client performing requirements verification. This will involve an iterative two-step process:

1. Determine what's missing or not required
Following the completion
of this verification phase, we will be ready to begin defining the system we
intend to build more precisely. This will be in the form of the Requirements
Solution Statement.

1.3.3 Requirements Document Development

The requirements solution statement is an abstract definition of the functional and non-functional requirements of the system as we have interpreted them. It is targeted at both the application engineers and the client, and as such will be in a form readable and understandable by both. It will serve as the contract between ourselves and the client by defining exactly what we have to provide. The requirements document also serves as a basis for QA to develop testing procedures for the development/testing phase of the project.

As stated above, verification will once again take place. We will present the solution statement to the client and perform a requirements review. This will involve working through each functional and non-functional requirement, explaining the implications of each to the client. Any omissions, errors, conflicts or contradictions will be formally recorded and resolved.

Validation is the process of making sure we are building the product correctly; that is, making sure it is not faulty or "buggy." Throughout development, weekly builds of the software will be declared and handed to the Quality Assurance (QA) team. QA will then perform various testing procedures on the product and present the defects found to the application engineers to fix. Once all the defects found by QA have been fixed, another build is declared and the process repeats. The actual testing process is defined more precisely in the testing section later in this document.

Once we have our final solution statement, we need to design the final product. Given that we will have already built the functional prototype by this stage, we should have a good idea of how to go about building the system. We will be able to look at the design of the prototype and determine:
1.3.6 Product Specification and Design

This will be based on the solution statement and our solution analysis. It will include implementation-specific information such as class diagrams, object models and so forth. At this stage the specification and design process is relatively undefined; as the project develops, more information will become available.

As above, the process of implementation will become clearer as it approaches and will be defined at a more appropriate time.

2. Configuration Management Plan & Procedures

From the point where there is a functional user interface skeleton, builds will be made at intervals set by the Application Engineer. Each build will be a tagged version, named sequentially build-1, build-2, etc. These builds are sent to QA and, if they are deemed suitable to be released as versions, they are released as internal versions. The criteria used by QA to determine what qualifies as a version are specified in this document under the QA and Test Plan sections [See Section 5]. When versions are to be made, they will be named sequentially version-1, version-2, etc.

These builds and versions will be performed by the Configuration Manager. Any problems should be reported to the Application Engineer (author) and the Configuration Manager. Instructions on checking out builds and versions will be published when the first build is created.

The build and version numbers are for internal use only and do not relate to the public releases. Public releases will be made at the discretion of the Application Engineer, subject to quality assurance approving a build, and will be given names such as 'Cohesion' and 'InSync'. Projected release dates will be published by the Application Engineer when they become available.

3. Development Support Environment

The software development environment for any Team Synergy product is a suite of applications which allow quality software to be written in a flexible way. All of the development software is available in the Synergy software development laboratory. The following software is supported:
Developers are free to use other development tools. Before any code is checked in to the repository, it must compile using the supported tools. Furthermore, all code being checked into the repository should be passed through the PrettyTyper program to ensure that the Team Synergy-defined code style guide is adhered to. Developers may not receive support for problems caused by using unsupported tools. As new tools become available and are tested, they may be added to the list of supported software.

The software listed above requires the following minimum system hardware specification to run:
Development under other operating environments and/or operating systems is not supported. It may be possible to develop and build under the Solaris OS or on private Linux installations, but this is not supported and any problems must be solved by the individual developers. All code developed under these environments must be tested under the supported environment before being checked in to the repository.
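The check-in rules above (code must compile with the supported tools and pass through PrettyTyper) lend themselves to a small gate script. The sketch below is a hypothetical illustration only: the plan does not specify how either tool is invoked, so the `javac` and `prettytyper` command lines are assumptions.

```python
import subprocess

# Hypothetical pre-check-in gate: every command must exit with status 0
# before the code may be committed to the repository. The exact commands
# below are assumptions, not taken from this plan.
CHECKS = [
    ["javac", "Main.java"],        # must compile with a supported tool
    ["prettytyper", "Main.java"],  # must conform to the Synergy style guide
]

def ready_to_check_in(checks, run=subprocess.run):
    """Return True only if every required check succeeds."""
    for command in checks:
        if run(command).returncode != 0:
            return False  # stop at the first failing check
    return True
```

A developer would run such a gate locally before every commit; the `run` parameter is injectable so the gate itself can be tested without the real tools installed.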
The purpose of this Verification and Validation Plan for the Team Synergy Product is to document the planned steps taken by Synergy to ensure that the product developed is of high quality in relation to what the customer has requested. In other words, it assures that we take steps so that:
This Verification and Validation Plan addresses the focus of Team Synergy development analysis activities to be performed during the period 15 March 1999 to 30 October 1999. Specific analysis activities are as follows:
Lifecycle Phase Independent Activities for the Team Synergy Product are those whose execution is independent of the particular lifecycle phase in which they are executed. This section addresses those activities and ad hoc document review support. The Concurrent Versioning System will be the place to obtain code for development analysis activities. The designated Team Synergy Quality Assurance representative(s) will be responsible for coordinating test witnessing activities, and the QA Team Leader will be responsible for test status reporting. Information on individual test execution and related technical questions will be available from QA Team members as needed. The Team Synergy development analysis and test witnessing team will utilise Team Synergy designated office space at Swinburne University of Technology, Hawthorn Campus, to facilitate communication and access to information.

4.3.1 Critical Analysis and Risk Assessment

One of the initial steps in planning and allocating Synergy resources to a release effort is to perform a Critical Analysis and Risk Assessment. The outcome of this study allows the Team Synergy team to assign priorities to the various release components and assures that the most critical areas receive adequate coverage.

Software documentation reviews are conducted to observe measurable progress in the software completion process by reviewing and analysing delivered software design and development documentation. There are three kinds of document reviews conducted by the project team, namely:
Lifecycle Phase Dependent Activities are those performed during specific phases of the Team Synergy Product development lifecycle. The Team Synergy project team will review the following steps in the Team Synergy Product life cycle to ensure that the product is being built to fit the Customer's requests:
Analysis evaluation consists of examining both the process by which Team Synergy performed the analysis for the Team Synergy Product and the actual requirements generated by that effort. These requirements are then compared with the known requirements received from the Customer and the functionality the Customer requires.

Design evaluation consists of examining both the process by which Team Synergy produced the design for the Team Synergy Product and the actual products generated by that effort. Follow-on design analysis will focus on reviewing the progress of software development processes, and on changes and enhancements to design products.

4.4.3 Software Development Evaluation

Software Development Evaluation consists of the Team Synergy team analysing software code and related documents to assess whether the implementation is traceable to the design and of high quality. The software will also be checked for standards compliance, internal code consistency, appropriate functionality, and support of desired user interaction, as appropriate. The Team Synergy project team will employ the following process in performing software development analysis:
Test Evaluation consists of the Team Synergy project team witnessing and independently analysing results of system tests performed by the QA Team. The Team Synergy project team will employ the following process in performing test evaluation:
The following is an outline of the planned testing strategy for all Team Synergy software development projects. It is important to note that throughout each cycle, code inspections and reviews will take place when time permits. Any parts of code not testable through the black box / interface testing methods will be exercised by test harnesses written by QA team members and approved by the Project Manager, Development and QA team leaders. For Team Synergy QA and non-QA internal procedures and issues relating to testing, please refer to the Quality Assurance Plan document available on the Team Synergy web site.
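As a sketch of what such a QA-written test harness might look like, the fragment below drives a unit with a table of known inputs and expected outputs. The unit under test (`classify_severity`) and its expected values are invented purely for illustration.

```python
def classify_severity(code):
    """Illustrative unit under test: map an internal error code to a
    severity level 1-5 (this mapping is an invented example)."""
    return {0: 5, 1: 3}.get(code, 1)

def run_harness(unit, cases):
    """Drive the unit with (input, expected) pairs and collect any
    mismatches as (input, expected, actual) tuples."""
    failures = []
    for given, expected in cases:
        actual = unit(given)
        if actual != expected:
            failures.append((given, expected, actual))
    return failures
```

An empty failure list means the section of code passes its harness; a non-empty list would be written up as defects for the application engineers.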
This will involve the white box and black box methods of testing, using test harnesses to locate defects within each class of the system. The outputs expected from the inputs entered will be determined from the requirements specification document.

The main focus will be on interface testing the parameter and procedural interfaces; only minimal black box and white box testing will occur at this stage. Interface testing will take up the majority of work hours, with a strong focus on the testing of the synchronised classes and message passing interfaces (anticipated). Thread testing will be repeated on the entire integrated system once all sub-systems are free of interface errors.

5.1.5 Stress/Performance Testing Cycle

The idea of this phase is to see how quickly and reliably the system operates under increasing loads and numbers of users. The aim is to identify possible problem areas that can be fixed in order to fine-tune the system.

5.1.6 Acceptance Testing Cycle

This will be the final phase of testing. It will take place only after all the previous test phases have been completed and the system is error free within the development environment. The purpose of this test phase is to test the performance of the system in its real-world working environment with real users. This is done in order to find errors not anticipated by the developers or QA staff.

During the regular code inspections and reviews, particular attention will be paid to:
For each class, cluster of classes, sub-system and the entire system, a test plan/script document will be written with the following sections and content:
A brief introduction that explains the content of the following three sub-sections, which are:
A description/list of every aspect of the functional points of the software to be tested.

5.4.2 Breakdown of Non-Functional Aspects

A description/list of every aspect of the non-functional points of the software to be tested.

5.4.3 Test Sections and Example Tests

A description, in table format, of the formulated test sections and of sample tests that will be conducted within those test sections.
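To illustrate, one of the test sections described above could be captured as a simple structured record. Every name and value below is invented for illustration; none is taken from an actual Synergy test plan.

```python
# Illustrative record mirroring the structure of a test section: purpose,
# inputs, expected outputs, recording files, and numbered detailed tests.
test_section = {
    "name": "Exception Handling",
    "purpose": "Verify that invalid inputs produce the documented errors.",
    "inputs": ["malformed logon request", "oversized message"],
    "expected_outputs": ["error dialog shown", "message rejected"],
    "recording_files": ["exception_test.log"],
    "tests": [
        {"number": 1,
         "steps": ["submit a logon with an empty user name"],
         "expected_result": "logon rejected with an explanatory message"},
    ],
}
```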
5.5.1 Test - Section Name (e.g. Exception Handling)

A brief description of the purpose of the test section. An outline of the types of inputs entered by the tester or by the test harness. An outline of the outputs expected from the software for the types of inputs entered. Also include any recording devices which will monitor the test process; if this includes files, list the names of the files that record the information.

Numbered detailed tests, their steps and expected results. The number of tests depends on the size of the software to be tested. With each of the tests (each number within this section), copy the following to the end of each number:
5.6.1 Review (section of software to review)

A brief description of the purpose of the review section, and a numbered list of things to review and report personal opinions on, e.g. the colour of the background in the client logon dialog box.

6. Quality Assurance Process Plan

This document describes the following:
Team Synergy QA will meet weekly at 4.30pm on Mondays to discuss ongoing work and any problems that may arise. These meetings are compulsory for all QA team members.

Formal deliverable documents written by Synergy members will be reviewed by a QA team member for content correctness and completeness according to:
The Synergy Internet Site will be reviewed by one or more QA team members for content correctness and completeness according to:
For the functional aspects of the Synergy Internet Site, software testing procedures will be followed according to the procedures and issues detailed in Section 6.5.

The following section outlines the procedure and the standards expected for the release of software to be tested.

6.4.1 Procedure for Releasing Software to QA
Any code that is released to QA must adhere to the following standards:
This section describes the procedures QA will follow in order to assure a complete and unbiased testing of the Synergy software product. For each section of the system to be tested, the following steps will occur:
The following are issues relating to the process of the Synergy QA Team:
The KeyStone tracking system will be used for logging, viewing and editing bugs. The information on each bug logged will be as follows:
Each test cycle will be iterative. To move on to the next test cycle, either no bugs, or only bugs of severity 5, may remain. There must be no bugs logged in the Internet Site bug tracking system before any stress or performance testing can be commenced.

All code inspections will be performed according to the Software Inspection Process document obtainable from the QA Team Leader.

7.1 Document Completion Procedure

The various steps to be followed for a given document to be completed are as follows:
To ensure that a document has been transferred from one sub-group within Synergy to another, and that each group is aware of the transfer, the following steps must be adhered to:
Once the Quality Assurance Team has checked and verified the document for compliance with their standards, one of two steps will be taken:
For a deliverable to be ready for submission, it must first have followed the procedures outlined in the Documentation Plan [See Section 7]. This process verifies that the deliverable has met the Team Synergy Style and Quality Assurance Standards. Finally, the deliverable is published on the web site before 9.00pm on the Friday of the due week and an email is sent to the assessor.
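As a small illustration of the publication rule just described (on the web site before 9.00pm on the Friday of the due week), a date check might look like the following; the dates used in the usage note are arbitrary examples.

```python
from datetime import datetime, time, date

# Sketch of the submission deadline: a deliverable meets the rule only
# if it is published before 21:00 on the Friday of its due week.
def meets_deadline(published, due_friday):
    """True if `published` falls before 9.00pm on the given due-Friday."""
    assert due_friday.weekday() == 4, "due date must be a Friday"
    deadline = datetime.combine(due_friday, time(21, 0))
    return published < deadline
```

For example, a deliverable published at 8.30pm on Friday 29 October 1999 meets the deadline, while one published at 9.05pm the same evening does not.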
Table 8.1 - 1: Major Deliverables

9.1 Deliverable Distribution Procedure

When a given document is ready to be released, the following steps are to be taken:
Once all the relevant procedures have been followed to produce the deliverable, all that remains is for Team Synergy to promote the product. To do this, the following steps will be followed: