Saturday, April 29, 2017

BPI Information Technology Testing – Build Phase

Description

  • Examination and fine-tuning of key components of the IT system to ensure performance and readiness for production. Before fine-tuning, the initial performance of the system is often significantly lower than clients expect. Technical tuning is therefore required to ensure acceptable performance.    
  • System testing typically includes:
    • Unit Testing: Formal tests applied to each 'unit' of functionality within the system (a minimal sketch follows this list)
    • Integration Testing: Tests of the entire business solution, including the passage of data to and from other integrated systems
    • Data Load / Data Conversion Tests: Tests that data prepared for cut-over to the new system is acceptable, as described in Data Migration
    • Volume/Stress Testing: Creating transactions (and file sizes) that simulate normal and peak workloads, thus verifying response times, processing times, file capacity and timing (e.g. to allow effective scheduling of computer runs)
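
As a simple illustration of the unit-testing level, the sketch below shows what one such test might look like using Python's standard unittest module. The function calculate_invoice_total and its figures are hypothetical examples invented for illustration, not part of any particular system.

  # Minimal unit-test sketch using Python's standard unittest module.
  # 'calculate_invoice_total' is a hypothetical unit of functionality,
  # used only to illustrate the pattern of input -> expected result.
  import unittest

  def calculate_invoice_total(line_items, tax_rate):
      """Sum line item amounts and apply a flat tax rate."""
      subtotal = sum(line_items)
      return round(subtotal * (1 + tax_rate), 2)

  class TestInvoiceCalculation(unittest.TestCase):
      def test_total_with_tax(self):
          # One valid case: known inputs, pre-calculated expected result.
          self.assertEqual(calculate_invoice_total([100.0, 50.0], 0.10), 165.0)

      def test_empty_invoice(self):
          # Edge case: no line items should yield zero.
          self.assertEqual(calculate_invoice_total([], 0.10), 0.0)

  if __name__ == "__main__":
      unittest.main()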

Client Value

  • IT testing validates that the operation of the IT system meets the defined 'functional requirements'. Since these tests are performed during IT system development, problems can be identified early in the development life-cycle. (The earlier errors are discovered, the faster and cheaper they can be corrected.)    
  • Inadequate or incomplete IT testing often results in performance problems or in significant overruns in engagement cost and schedule. Typically, problems that are not discovered until after system implementation receive the highest visibility and have the greatest impact on the operations.

Approach

The overall objective of Information Technology Testing is to demonstrate that the system is suitable for 'live' usage. IT testing focuses on two elements: validation that the system design meets user requirements, and verification that the system functions as designed. Management can then implement the system with confidence, knowing that the IT system will meet the needs of the business solution.
The following steps are repeated for each type of testing performed.
  1. Define the types and phases of testing to occur during the system development life-cycle.
    1. Testing is essential at several stages in the delivery of new systems. During the initial planning stages, plan (and obtain approval for) the test requirements and strategy, defining:
      1. Objectives for the phase of testing       
      2. Objectives and detailed work-steps for each test           
      3. Anticipated results for each test, as appropriate.       
    2. In the case of package software, focus on whether the functionality of the package meets the organization's defined needs. Do not attempt to test all the program code included in the basic underlying package software, as that is primarily the vendor's responsibility (and represents a large part of the package software cost). For custom-developed software, include the underlying structure of the developed programs in the testing plans.
    3. Develop the plans for testing.       
      1. Determine the conditions under which the system will be tested. (It is probably not possible, and certainly not reasonable, to test every single set of circumstances that can arise. The testing needs to strike a reasonable balance between comprehensive coverage and risk.)   
      2. Compose a series of actions/expected results which prove that the system performs correctly (a sketch of how such a test case might be recorded appears after this list of steps).
      3. Determine/assign testing responsibilities.   
  2. Set up a special 'test environment' to ensure that assessments can be 'clean' (i.e. will not suffer from unpredictable results due to the sharing of other data or program versions on the same equipment).
  3. Define back-up/recovery procedures.
    1. Build in the flexibility to change versions and scenarios freely, particularly when parts of a test must be repeated once a problem is resolved. Given the high number of system changes that can occur during Information Technology Testing, clearly define and rigidly enforce backup and recovery procedures.        
  4. Conduct the Information Technology Test. Report and investigate 'incidents'. Repeat tests with corrections.
    1. Testing normally leads to a high volume of 'incident' reports, typically caused by a misunderstanding or miscalculation of the expected results (e.g. due to unanticipated data inputs resulting in a different test).
    2. In a minority of cases, some action may be required to change the system itself. In such cases, make these changes without undue delay, as they can hold up the system testing process. Where the system is changed in any way, repeat previous tests if they might have been affected by these changes.
  5. Compile and evaluate final test results and obtain appropriate sign-offs.   
  6. Tune performance of the IT system based on test results (a timing sketch follows this list). The main aspects that need to be tuned and tested are:
    1. Physical size of files and databases
    2. Real-time performance: average transaction times per type of transaction
    3. Length of batch runs
    4. Percentage of processor and other resources utilized (i.e. how much other work can be done while the system is running)
    5. Capability of the communications network to handle the loads and peak concurrency
    6. Efficiency of processes in achieving performance requirements without wasting resources.
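
To illustrate steps 1 and 4 above, the following sketch shows one possible way of recording test cases with their conditions, actions, expected results, assigned responsibility and status. The field names and the example case are assumptions made purely for illustration, not a prescribed standard.

  # Illustrative sketch of a test-case record capturing the planning
  # elements described above: conditions, actions, expected results and
  # assigned responsibility. Field names are assumptions, not a standard.
  from dataclasses import dataclass
  from typing import List

  @dataclass
  class TestCase:
      case_id: str
      objective: str            # what this test is meant to prove
      preconditions: List[str]  # conditions under which the system is tested
      steps: List[str]          # actions to perform, in order
      expected_results: List[str]
      assigned_to: str          # testing responsibility
      status: str = "Not run"   # Not run / Passed / Failed / Repeat after fix

  plan = [
      TestCase(
          case_id="INV-001",
          objective="Verify a standard invoice is posted correctly",
          preconditions=["Test environment loaded with converted master data"],
          steps=["Enter invoice with two line items", "Post the invoice"],
          expected_results=["Invoice total matches manual calculation",
                            "General ledger entries created"],
          assigned_to="Test Leader",
      ),
  ]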
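
For step 6, a minimal sketch of how average and worst-case transaction times might be measured during volume/stress testing is shown below. Here submit_transaction is a hypothetical stand-in for the real system interface, and the transaction names are invented for illustration.

  # Minimal timing sketch for volume/stress testing: measure average and
  # worst-case response times per transaction type. 'submit_transaction'
  # is a hypothetical stand-in for the real system interface.
  import time
  import statistics
  import random

  def submit_transaction(txn_type):
      # Placeholder for a call into the system under test.
      time.sleep(random.uniform(0.01, 0.05))

  def measure(txn_type, iterations=100):
      timings = []
      for _ in range(iterations):
          start = time.perf_counter()
          submit_transaction(txn_type)
          timings.append(time.perf_counter() - start)
      return {
          "transaction": txn_type,
          "average_s": round(statistics.mean(timings), 4),
          "worst_s": round(max(timings), 4),
      }

  for txn in ("order_entry", "invoice_post", "stock_enquiry"):
      print(measure(txn))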

Guidelines

Problems/Solutions

  • Often, test schedules are sacrificed to compensate for design and development schedule overruns. Subsequently, testing is incomplete and rushed, and unforeseen problems appear during implementation and initial deployment. Ensure that adequate time is allotted to conduct tests.

Tactics/Helpful Hints

  • Carefully manage client expectations during the completion of this deliverable. There is no such thing as an error-free IT system, so the client must understand that testing is a balance of cost against risk. The appropriate investment level and level of risk must be understood and decided by the client, not the External Consultant.
  • Recognize that solid advance planning for IT Testing is key to its success. IT Testing should encompass all system functions and processes to be used in production. Where beneficial, divide the overall testing into a number of cycles, each building on the results of the others. Ensure definitions, procedures and expectations for the tests are precise and clear. Document all test results and ensure problems are addressed by the appropriate level of management.
  • Remind the client that a well-designed testing environment and a defined test base are useful for validating future system releases and ongoing enhancements.
  • Obtain agreement on the scope of functional testing, by drawing upon the following typical rules (a test sketch illustrating several of them follows this list):
    • Pick at least one case of each valid transaction type.   
    • Take the valid cases through a complete life-cycle (e.g. creation, amendment, update processing, archiving, deletion)
    • Take the system through its main types of update run (e.g. first day of the month, normal daily update, month end, year end, first day of new year).
    • Test the validation rules on each field of each screen involved, e.g. mandatory / optional / hidden, error messages - fatal / warning, etc.
    • Pick at least one case of each error type for which there is a specific defined error handling process (in both real-time and batch updating, if appropriate)
    • Test at least one case for each record type where the fields of data are all at their maximum and/or minimum values   
    • Put through sufficient transactions such that all reports have at least three records per sub-total and three sub-totals per total    
    • Generate at least one example of each report (or section of report).
    • Process data from interfaces:
      • Normal, correct data with all normal record types
      • With reconciliation errors
      • With fields blank or missing
      • With invalid data in fields (e.g. alpha characters in numeric fields)
      • With more than one run's data in a single update run
      • Check reversing out the update in the case of feeder system errors.
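
Several of the rules above (mandatory fields, maximum/minimum values, invalid data in numeric fields) can be expressed as simple table-driven tests. The sketch below is one possible illustration; the validate_quantity function and its validation rules are assumptions, not taken from any particular system.

  # Table-driven sketch covering several of the rules above: mandatory
  # fields, maximum/minimum boundary values, and invalid (alpha in
  # numeric) data. 'validate_quantity' and its rules are illustrative.
  import unittest

  def validate_quantity(value):
      """Return (ok, message) for a mandatory numeric field in range 1-9999."""
      if value is None or value == "":
          return False, "Quantity is mandatory"
      if not str(value).isdigit():
          return False, "Quantity must be numeric"
      if not 1 <= int(value) <= 9999:
          return False, "Quantity out of range"
      return True, ""

  class TestQuantityValidation(unittest.TestCase):
      CASES = [
          ("", False),        # mandatory field left blank
          ("abc", False),     # alpha characters in a numeric field
          ("0", False),       # below minimum
          ("1", True),        # minimum boundary
          ("9999", True),     # maximum boundary
          ("10000", False),   # above maximum
      ]

      def test_cases(self):
          for value, expected_ok in self.CASES:
              ok, _ = validate_quantity(value)
              self.assertEqual(ok, expected_ok, f"value={value!r}")

  if __name__ == "__main__":
      unittest.main()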

Resources and Timing

  • Clearly specify the roles and responsibilities of the User Manager and Test Leader during Information Technology Testing activities:
    • The User Manager reviews and signs off each specific aspect of the testing.
    • The User Manager works with the project team to see that the tests are comprehensive and satisfactory and accepts the results on behalf of the client organization, by:       
      • Ensuring that the definition of the tests provides comprehensive and effective coverage of all reasonable aspects of functionality   
      • Checking that the expected results have been properly calculated   
      • Reviewing the impact of any incidents, problems and issues raised and assisting in their resolution.
      • Ensuring that the final outcomes of the tests are satisfactory.            
    • The Test Leader (often a core project team member) is the person responsible for monitoring the detailed definition and execution of each part of the testing, by:
      • Consulting with the User Manager and all interested parties to define the objectives, and the test details.   
      • Managing the preparation of test scripts.       
      • Managing the conduct of the tests, including detailed planning, booking resources (human/environmental), as well as inter-dependencies with development or other testing activities.       
      • Monitoring the usage and follow-up of any test incident control forms.   
      • Reviewing and obtaining agreement on the results of the tests with the User Manager and other interested parties.
      • Ensuring the overall delivery of the successfully completed test.
