Wednesday, July 6, 2016

Project Delivery Process D812

D812 - Evaluate Testing and Report Results

[Image: SIIPS Delivery Processes (D)]

DEFINITION

Completely evaluate and document the results of the testing processes.

SUMMARY

The purpose of this activity is to collate, summarise, analyse, and evaluate the results of the complete testing programme.  Depending on the complexity of the programme, it may be appropriate to undertake these reviews, reports and signoffs per stage of testing rather than just once for the overall testing.
During the tests, valuable data has been collected, errors have been encountered and resolved, and test cases have been initiated and reinitiated.  It is during this activity that both the testing processes and software are evaluated to determine how well the testing programme at this level has met the testing objectives.
The results, incident reports and outstanding issues are reviewed.  Further action may be recommended, either to be completed prior to live running or to be postponed until a later stage of development or maintenance.
The overall responsible user manager must signify that the results are adequate for a live service to be commenced.  Note that this rarely means a perfect set of test results.

PATH PLANNING GUIDANCE

Optional - testing is always reviewed and signed off (see processes D800/D810).  This process is used for large testing programmes whose reporting and signoff requirements mean that formal reports are needed in addition to the testing control information.

DEPENDENCIES

Prerequisites (Finish-Start):
  • Functional testing (unit tests, system tests, integration tests, user acceptance tests etc) - D800
  • Technical testing and tuning - D810
Dependent procedures (Finish-Start):
  • Decision to go live - D900

RECEIVABLES

  • Test objectives
  • Test definitions
  • Test logs
  • Incident logs
  • Incident reports
  • Test signoffs

DELIVERABLES

  • Report on testing effectiveness
  • Report on overall status of system based on test results
  • Management summary report (if appropriate)
  • Overall signoff by overall responsible user manager / project sponsor

TOOLS

  • (none)

DETAILED DESCRIPTION OF TASKS

These reviews and reports allow the results of complex testing programmes to be summarised such that they can be accepted by the business owners of the system.  In very complex projects where several business managers are involved, it may be appropriate to repeat the reviews, reports and signoffs per stage of testing, for example, upon the completion of all unit tests, system tests, interface tests, conversion tests, integration tests, and technical tests.

Confirm Testing Completeness

The purpose of this activity is to confirm the status of the testing processes, either by identifying areas of the original test plan where further work remains necessary to complete the testing, or by confirming that the current position is adequate to meet the overall business objectives.
Tasks may include:
  • check completeness of coverage of the testing programme, eg all significant business processes, all significant aspects of functionality, all reasonably foreseeable error conditions,
  • identify any additional testing required,
  • identify modifications to test scripts,
  • identify modifications to procedures,
  • identify further necessary enhancements which were not initially planned,
  • produce overall summary of status.
A summary report should be produced (see the illustrative sketch after this list) which may include:
  • description of status
  • summary of the entire test level
  • planned vs actual by test
  • indication/analysis of problem areas or test cases
  • indication of whether testing was complete and adequate for all manual and automated procedures
  • report of downtime.
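As an illustration only, the sketch below shows one way the "planned vs actual" figures in such a report might be tabulated from a machine-readable test log.  The record fields (test_id, area, planned, executed, passed) are assumptions made for the example, not a format prescribed by SIIPS.

from collections import Counter
from dataclasses import dataclass

# Hypothetical test-log record - field names are illustrative, not prescribed by SIIPS.
@dataclass
class TestResult:
    test_id: str
    area: str        # business process or functional area covered
    planned: bool    # case was in the original test plan
    executed: bool   # case was actually run
    passed: bool     # final result after any re-runs

def summarise(results):
    """Tabulate planned vs actual and pass/fail counts per area."""
    summary = {}
    for r in results:
        c = summary.setdefault(r.area, Counter())
        c["planned"] += r.planned
        c["executed"] += r.executed
        c["passed"] += r.executed and r.passed
        c["failed"] += r.executed and not r.passed
    return summary

if __name__ == "__main__":
    log = [
        TestResult("UT-001", "Order entry", True, True, True),
        TestResult("UT-002", "Order entry", True, True, False),
        TestResult("ST-010", "Invoicing", True, False, False),
    ]
    for area, counts in summarise(log).items():
        print(area, dict(counts))

In practice these figures are often produced directly from the test management tool or spreadsheet; the point is simply that the planned vs actual totals in the report should be traceable back to individual test cases.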

Confirm System Status

When testing has been completed, a report should be produced to publish the conclusions from the testing processes.  The purpose of this activity is to:
  • describe how well the system withstood the testing,
  • identify the need for modifications or enhancements that may have been recognised during testing but which have been deferred due to time and/or scope issues.
This review should:
  • identify any comments concerning software (from status reports or documentation)
  • identify any software enhancements by module (see the grouping sketch at the end of this section)
  • identify modifications or potential enhancements
  • describe software status in summary format.
The report may contain:
  • findings on software stability
  • description of additional features/functions which may benefit the organisation
  • enhancements to software
  • how well the test objectives reflected the requirements from analysis
  • areas where improvements may be considered
  • issues which it has been accepted do not need to be addressed prior to live running.
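
As a purely illustrative aid for this part of the report, the sketch below groups incident and enhancement records by module so that deferred items and candidate enhancements can be listed per area of the software.  The Incident fields (ref, module, severity, disposition) are assumptions for the example rather than a prescribed SIIPS format.

from collections import defaultdict
from dataclasses import dataclass

# Hypothetical incident/enhancement record - field names are illustrative only.
@dataclass
class Incident:
    ref: str
    module: str
    severity: str      # e.g. "critical", "major", "minor"
    disposition: str   # e.g. "fixed", "deferred", "enhancement"

def group_by_module(incidents):
    """Group incident and enhancement records by software module."""
    grouped = defaultdict(list)
    for inc in incidents:
        grouped[inc.module].append(inc)
    return dict(grouped)

if __name__ == "__main__":
    incidents = [
        Incident("INC-041", "Invoicing", "major", "deferred"),
        Incident("INC-042", "Invoicing", "minor", "enhancement"),
        Incident("INC-050", "Order entry", "minor", "fixed"),
    ]
    for module, items in group_by_module(incidents).items():
        outstanding = [i.ref for i in items if i.disposition != "fixed"]
        print(f"{module}: {len(items)} recorded, outstanding: {outstanding}")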

Test Summary Report

Where the detailed testing reports are too voluminous for senior management review, it may be appropriate to produce a summary report which covers, at a high level, both the status of the testing programme and the status of the application software.

Signoff

Individual tests were signed off by the agreed responsible user manager(s) for the relevant area of testing.  Based on the accumulated results and analysis reviewed and summarised in this process, the overall responsible user manager or project sponsor should accept that the testing programme has demonstrated that the system is suitable for live running.  There will normally be reservations noted at this time - perfection is never achieved in IT systems.
Any outstanding issues may be logged for resolution after live running commences, or it may be accepted that it is not beneficial to the business to resolve certain issues.  This can be a difficult balance of costs, benefits and risks.  Such decisions are business decisions and must be taken by the business owners of the system - not by the project team.
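Where issues are carried past the signoff, it helps to record the decision and its rationale alongside each item so that the cost/benefit/risk judgement remains visible to the business owners.  The sketch below is a minimal illustration of such a log entry; the field names are assumptions, not part of the SIIPS method.

from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Illustrative outstanding-issue record - not a prescribed SIIPS format.
@dataclass
class OutstandingIssue:
    ref: str
    description: str
    business_owner: str    # manager accountable for the decision
    decision: str          # e.g. "resolve after go-live" or "accept, do not fix"
    rationale: str         # cost / benefit / risk reasoning behind the decision
    target_date: Optional[date] = None
    raised: date = field(default_factory=date.today)

if __name__ == "__main__":
    issues = [
        OutstandingIssue(
            ref="ISS-017",
            description="Reprint of archived invoices is slow",
            business_owner="Finance manager",
            decision="resolve after go-live",
            rationale="Low volume; cost of fixing before go-live outweighs the benefit",
            target_date=date(2016, 9, 30),
        ),
    ]
    for i in issues:
        print(f"{i.ref}: {i.decision} (owner: {i.business_owner})")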
