BPI Information Technology Testing – Build Phase
Description
Examination and fine-tuning of key components of the IT system to ensure performance and readiness for production. Before fine-tuning, the initial performance of the system is often significantly lower than clients expect. Technical tuning is therefore required to ensure acceptable performance.
System testing typically includes:
Unit Testing: Formal tests applied to each 'unit' of functionality within the system
Integration Testing: Test of the entire overall business solution including the passage of data to and from other integrated systems
Data Load / Data Conversion Tests: Tests that data prepared for cut-over to the new system is acceptable as described in Data Migration.
Volume/Stress Testing: Creating transactions (and file sizes) that simulate normal and peak work loads, thus verifying response times, processing times, file capacity and timing (e.g. to allow effective scheduling of computer runs).
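The difference between a unit test and the broader tests above can be illustrated with a minimal sketch in Python (calculate_invoice_total is a hypothetical unit of functionality, not taken from any named system):

```python
import unittest

# Hypothetical 'unit' under test: one piece of business functionality.
def calculate_invoice_total(line_items, tax_rate):
    """Sum (quantity, price) line items and apply a flat tax rate."""
    subtotal = sum(qty * price for qty, price in line_items)
    return round(subtotal * (1 + tax_rate), 2)

class InvoiceUnitTest(unittest.TestCase):
    """Formal test applied to a single unit of functionality."""

    def test_total_with_tax(self):
        items = [(2, 10.00), (1, 5.50)]  # subtotal 25.50
        self.assertEqual(calculate_invoice_total(items, 0.10), 28.05)

    def test_empty_invoice(self):
        self.assertEqual(calculate_invoice_total([], 0.10), 0.0)
```

Integration and volume tests follow the same pattern, but exercise the assembled solution and realistic data volumes rather than one function in isolation.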
Client Value
IT testing validates that the operation of the IT system meets the defined 'functional requirements'. Since these tests are performed during IT system development, problems can be identified early in the development life-cycle. (The earlier errors are discovered, the faster and cheaper they can be corrected.)
Inadequate or incomplete IT testing often results in performance problems or in significant overruns in engagement cost and schedule. Typically, problems that are not discovered until after system implementation receive the highest visibility and have the greatest impact on the operations.
Approach
The overall objective of Information Technology Testing is to demonstrate that the system is suitable for 'live' usage. IT testing focuses on two elements: validation that the system design meets user requirements and verification that the system functions as designed. Management can then implement the system with confidence, knowing that the IT system will meet the needs of the business solution.
The following steps are repeated for each type of testing performed.
Define type and phases of testing to occur during the system development life-cycle.
Testing is essential at several stages in the delivery of new systems. During the initial planning stages, plan (and obtain approval for) the test requirements and strategy, defining:
Objectives for the phase of testing
Objectives and detailed work-steps for each test
Anticipated results for each test, as appropriate.
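One way to make objectives, work-steps and anticipated results explicit is to record each planned test as structured data. A minimal sketch (the field names and the example entry are illustrative, not part of the methodology itself):

```python
from dataclasses import dataclass

@dataclass
class PlannedTest:
    """One test-plan entry: objective, work-steps and anticipated result."""
    test_id: str
    objective: str
    work_steps: list       # detailed work-steps, in order
    expected_result: str   # anticipated result, where appropriate

# Illustrative entry for a data load / conversion test.
plan = [
    PlannedTest(
        test_id="DL-001",
        objective="Converted customer records load without rejection",
        work_steps=["Load converted customer file",
                    "Review the rejection report"],
        expected_result="Zero rejected records",
    ),
]
```

Recording the plan in a machine-readable form also makes it easy to track which planned tests have been executed and signed off.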
In the case of package software, focus on whether the functionality of the package meets the organization's defined needs. Do not attempt to test all the program code included in the basic underlying package software, as that is primarily the vendor's responsibility (and represents a large part of the package software cost). For custom-developed software, include the underlying structure of the programs that have been developed in testing plans.
Develop the plans for testing.
Determine the conditions under which the system will be tested. (It is probably not possible, and certainly not reasonable, to test every single set of circumstances that can arise. The testing needs to strike a reasonable balance between comprehensive coverage and risk.)
Compose a series of actions/expected results which prove that the system performs correctly.
Determine/assign testing responsibilities.
Set up a special 'test environment' to ensure that assessments can be 'clean' (i.e. will not suffer from unpredictable results due to the sharing of other data or program versions on the same equipment).
Define back-up/recovery procedures.
Build in the flexibility to change versions and scenarios freely, particularly when parts of a test must be repeated once a problem is resolved. Given the high number of system changes that can occur during Information Technology Testing, clearly define and rigidly enforce backup and recovery procedures.
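The clean-environment and repeatability requirements above can be sketched with a setup/teardown pattern; in this illustrative example a temporary directory stands in for a dedicated test database or file area:

```python
import shutil
import tempfile
import unittest
from pathlib import Path

class CleanEnvironmentTest(unittest.TestCase):
    """Every test starts from a freshly built workspace, so repeated
    runs cannot be polluted by data left behind by earlier tests."""

    def setUp(self):
        # Build the clean environment before each test.
        self.workdir = Path(tempfile.mkdtemp(prefix="it_test_"))

    def tearDown(self):
        # Remove it afterwards so a rerun restores the known state.
        shutil.rmtree(self.workdir)

    def test_data_file_is_isolated(self):
        (self.workdir / "input.dat").write_text("TESTDATA")
        self.assertEqual((self.workdir / "input.dat").read_text(),
                         "TESTDATA")
```

The same build/teardown discipline applies to databases and program versions: the environment should be rebuildable on demand so that any test can be repeated once a problem is resolved.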
Conduct the Information Technology Test. Report and investigate 'incidents'. Repeat tests with corrections.
Testing normally leads to a high volume of 'incident' reports, typically caused by a misunderstanding or miscalculation of the expected results (e.g. due to unanticipated data inputs resulting in a different test).
In a minority of cases, some action may be required to change the system itself. In such cases, make these changes without undue delay as they can hold up the system testing process. Where the system is changed in any way, previous tests should be repeated (if they might have been affected by these changes).
Compile and evaluate final test results and obtain appropriate sign-offs.
Tune performance of the IT system based on test results. The main aspects that need to be tuned and tested are:
Physical size of files and databases
Real-time performance: average transaction times per type of transaction
Length of batch runs
Percentage of processor and other resources utilized (i.e. how much other work can be done while the system is running)
Capability of the communications network to handle the loadings and peak concurrency
Efficiency of processes: achieving performance requirements without wasting resources.
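Figures such as average transaction time can be gathered with a simple timing harness. A minimal sketch, in which process_transaction is a placeholder for the real system call being tuned:

```python
import time

def process_transaction(txn):
    """Placeholder for the real system call being tuned."""
    return txn["amount"] * 2  # trivial stand-in work

def average_transaction_time(transactions):
    """Average elapsed seconds per transaction across a batch."""
    start = time.perf_counter()
    for txn in transactions:
        process_transaction(txn)
    elapsed = time.perf_counter() - start
    return elapsed / len(transactions)

# Time a batch that simulates a normal workload.
avg_seconds = average_transaction_time([{"amount": 10}] * 1000)
```

Running the same harness against batch sizes that simulate normal and peak workloads gives the response-time and throughput figures needed for tuning decisions.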
Guidelines
Problems/Solutions
Often, test schedules are sacrificed to compensate for design and development schedule overruns. As a result, testing is rushed and incomplete, and unforeseen problems appear during implementation and initial deployment. Ensure that adequate time is allotted to conduct tests.
Tactics/Helpful Hints
Carefully manage client expectations during the completion of this deliverable. There is no such thing as an error-free IT system, so the client must understand that testing is a balance of cost against risk. The appropriate investment level and level of risk must be understood and decided by the client, not External Consultant.
Recognize that solid advance planning for IT Testing is key to its success. IT Testing should encompass all system functions and processes to be used in production. Where beneficial, divide the overall testing into a number of cycles, each building on the results of others. Ensure definitions, procedures and expectations for the tests are precise and clear. Document all test results and ensure problems are addressed by the appropriate level of management.
Remind the client that a well-designed testing environment and a defined test base are useful for validating future system releases and ongoing enhancements.
Obtain agreement on the scope of functional testing, by drawing upon the following typical rules:
Pick at least one case of each valid transaction type.
Take the valid cases through a complete life-cycle (e.g. creation, amendment, update processing, archiving, deletion)
Take the system through its main types of update run (e.g. first day of the month, normal daily update, month end, year end, first day of new year).
Test the validation rules on each field of each screen involved, e.g. mandatory / optional /hidden, error messages - fatal / warning etc.
Pick at least one case of each error type for which there is a specific defined error handling process (in both real-time and batch updating, if appropriate)
Test at least one case for each record type where the fields of data are all at their maximum and/or minimum values
Put through sufficient transactions such that all reports have at least three records per sub-total and three sub-totals per total
Generate at least one example of each report (or section of report).
Process data from interfaces:
Normal, correct data with all normal record types
With reconciliation errors
With fields blank or missing
With invalid data in fields (e.g. alpha characters in numeric fields)
With more than one run's data in a single update run
Check reversing out the update in the case of feeder system errors.
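Several of the scope rules above (field validation, boundary values, invalid data types, mandatory fields) can be exercised with compact tests. A minimal sketch, assuming a hypothetical validation rule validate_quantity: a mandatory numeric field that must lie between 1 and 9999:

```python
# Hypothetical validation rule for one screen field.
def validate_quantity(value):
    """Return (ok, message) for a mandatory numeric quantity field."""
    if value is None or value == "":
        return False, "fatal: quantity is mandatory"
    if not str(value).isdigit():
        return False, "fatal: quantity must be numeric"
    qty = int(value)
    if not 1 <= qty <= 9999:
        return False, "warning: quantity out of range"
    return True, ""

# Boundary values: minimum, maximum, and just outside each.
assert validate_quantity("1")[0] is True
assert validate_quantity("9999")[0] is True
assert validate_quantity("0")[0] is False
assert validate_quantity("10000")[0] is False
# Invalid data: alpha characters in a numeric field.
assert validate_quantity("ABC")[0] is False
# Mandatory check: blank or missing field.
assert validate_quantity("")[0] is False
```

Applying this pattern to each field of each screen covers the mandatory/optional rules and the fatal/warning error messages called for above.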
Resources and Timing