S250 - Prepare and Agree Selection Report
DEFINITION
Collate all selection evaluation information, analyses and findings. Prepare and agree the final Selection Report.
SUMMARY
Information will have been gathered throughout the selection exercise. The competing proposals have been evaluated, analysed and compared in a variety of ways. Further information will have been gathered from a number of sources such as reference users, demonstrations, Package Fit studies, Benchmarking etc. Based on all the evidence available, the selection team should already have discussed the issues and conclusions with the project sponsor and any other key decision makers. The findings and conclusions will have been agreed in principle.
All the evidence and conclusions are now collated and condensed into the form of a final report. The report should compare the vendors in both quantitative and qualitative terms, concentrating on key differences. It will state a recommendation on behalf of the project team and project sponsor.
The report should be agreed with the project sponsor, who should accept ownership of the report and its recommendations. The recommendations may then need to be presented for formal executive approval according to the normal practices of the organisation.
PATH PLANNING GUIDANCE
This process is normal practice, but not mandatory. A client organisation may agree that the production of a formal report is unnecessary provided there is sufficient documentation of the evidence and findings.
DEPENDENCIES
Prerequisites (Finish-Finish):
- Mark responses (S190)
- Identify Issues (S200)
- Confer with Vendors (S210)
- Consult with Reference Users (S220)
- Package Fit Tests (if appropriate) (S180)
- Agree final marks and issues (S230)
- Benchmark (if required) (S240)
Dependent procedures (Finish-Finish):
- Negotiate terms with preferred vendor(s) (S300)
- Define Implementation Strategy / Delivery Approach (S700/S710)
RECEIVABLES
- vendors’ proposals
- comparative data based on vendors’ proposals, including costs and benefits
- Zero Scores List
- selection issues list / log
- reports / questionnaires from vendor demonstrations
- reports / questionnaires from user reference site visits
- report from Package Fit tests (if any)
- findings from the final evaluation of marks and issues
- findings from Benchmarking
DELIVERABLES
- Selection Report
TOOLS
- Examples: System Recommendations Report
- Examples: System Recommendations Signoff letter
- Examples: Zero Scores List
- Examples: Selection Issues List
- Examples: System Demonstration Questions
- Examples: Site Visit Questions
- Examples: scoring spreadsheets
- Examples: Application Comparison Worksheet
- Examples: Hardware/Software Comparison Worksheet
- Examples: Vendor Comparison Worksheet
- Examples: Cost Comparison Worksheet
- Examples: System Quality/Vendor Quality (SQ/VQ) Charts
- Examples: Cost Summary Chart
- Benefit Realisation Core Guide - Benefit Model
DETAILED DESCRIPTION OF TASKS
Collate final materials
The project team should have gathered together all materials collected during the evaluation. These may include:
- Original proposals and further correspondence from the vendors
- System Demonstration Questionnaires or visit reports
- Site Visit Questionnaires or visit reports
- scoring spreadsheets
- Zero Scores List
- Selection Issues List
- Package Fit Test reports
- Application Comparison Worksheet
- Hardware/Software Comparison Worksheet
- Vendor Comparison Worksheet
- Cost Comparison Worksheet
- System Quality Chart
- Vendor Quality Chart
- System Quality/Vendor Quality Chart
- Cost Summary Chart
- Benchmark Report or other Benchmarking results
Produce the Selection Report
There should be no need for further investigation or evaluation. This process is concerned solely with documenting the evidence that has been gathered and the findings that have been agreed.
This information should be presented in a way that highlights the key issues and justifies the chosen preferred approach. It may be necessary for the report to be in a specific format so that it can be used to request formal approval from the client organisation’s executive authority.
There is no prescribed format for the report. Typically, the Selection Report will contain:
- Introduction
- Background
- Scope
- Summary of the selection process
- Management Summary
- Requirement
- Key business needs
- Key constraints
- Comparisons
- System Quality Comparison
- System Functionality Comparison
- Software Design Comparison
- Hardware Design Comparison
- Overall System Quality Comparison
- Vendor Quality Comparisons
- Vendor Support Comparison
- Vendor Stability Comparison
- Overall Vendor Quality Comparison
- Cost Comparisons
- Proposed System Cost Comparison
- Status Quo Comparison
- Results of the Package Fit study
- Results of the Benchmarking
- Recommendations
- Key differentiating factors
- Conclusions
- Preferred solution
- Cost implications / formal financial request
- Outstanding issues
- Appendices - Optionally include details of:
- comparison worksheets
- Zero Scores List
- Selection Issues list
- full requirements
- full responses
- full scoring tables
- reference site visit reports
- reports from vendors’ demonstrations
The material included will depend on the circumstances. It is not normally necessary to include the full detail in the report, provided that the documentation remains on file and is cross-referenced within the text.
Contents of the report
The contents of the report are generally statements of the facts, findings and conclusions from earlier processes. There are, however, a number of specific considerations in terms of these contents.
- The evaluation process, and, in particular, the Selection Report, is not intended to present a complete understanding of each software package. Finalists should be evaluated and documented only to the point that one system can be recommended.
- Comparisons may be made against a “Benefit Model”. This is a collection of individual quantified measurements of both tangible and intangible benefits or costs. It is intended to give a balanced view of benefits based upon the client organisation’s understanding of the benefits it expects from the new system.
- The cost evaluation should include all one-time costs (including implementation support) and ongoing maintenance costs. An expected “system life” of five to seven years is typically used. Because of changes identified during the evaluation, vendors are frequently required to recompute their cost estimates before the cost evaluation can be completed. To establish a common baseline, calculate the present value of the expenditure stream less the expected savings from scrapping and no longer supporting the existing system (a minimal worked sketch of this calculation appears after this list).
- Comparing only the maintenance costs of continuing to operate the existing systems against the purchase costs of implementing the new technology strategy assesses only the “capital outlay” and may not give an accurate picture of the alternatives. Adding operating costs to the direct costs often reveals that, although the new system may be more expensive to purchase, it saves significant user and operating costs. Operating costs can be quantified by estimating the time users spend performing the related manual functions and the MIS time needed to operate and maintain the system, factored by a reasonable hourly rate representing the cost of service.
- The project team should avoid making claims about the efficiencies, reductions or savings that will result from implementing a particular solution, although they may assist in preparing a Cost Benefit Analysis; the client organisation will need to satisfy itself that the preferred solution will support that Cost Benefit Analysis.
- The team should also avoid making any claims as to the performance of any specific solution or its freedom from system bugs or programming errors.
- Although the project team and Consulting personnel may be critical to orchestrating the evaluation process and providing recommendations, it is imperative that the client organisation understands that responsibility for the final selection rests with the client organisation itself. Organisations sometimes become overwhelmed by the prospect of making a long-term system commitment and have extreme difficulty in making that final selection. Professional judgment is needed to determine whether additional analysis is required or whether the decision makers simply need “moral support”.
- A number of policy and preference decisions made by the client management team will influence the selection decision. Over time, and with turnover in key management positions, it is common for management to have difficulty remembering the circumstances and conditions under which these decisions were made and to attempt to remake the system selection decision. These system selection assumptions should be documented, along with relevant background material, and included in the Implementation Plan.
- Because much of the information collected during the evaluation process is based on subjective responses, it is important to avoid statements and assessments that imply a higher level of precision than was actually achieved.
- Though extensive effort is given to tabulating, charting and comparing scores for each vendor, this information should be kept in context. Since it is highly unlikely that any system and vendor will perfectly fit all the client’s needs, determining the best solution requires weighing trade-offs which often cannot be reduced to a chart. Though it is important that the analysis and findings are presented in a professional manner, it is much more important that the client choose the system that will best suit their needs.
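The cost-baseline and operating-cost points above can be reduced to a simple present-value calculation. The following is a minimal sketch in Python; the 10% discount rate, five-year system life, cost figures, hours and hourly rate are illustrative assumptions only and are not prescribed by this process.

```python
# Minimal sketch of the cost baseline described above, using assumed figures.

def present_value(cash_flows, discount_rate):
    """Discount a stream of annual cash flows (year 0 = now) to present value."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in enumerate(cash_flows))

DISCOUNT_RATE = 0.10  # assumed cost of capital
SYSTEM_LIFE = 5       # years, within the typical five-to-seven-year range

# Candidate package: one-time costs in year 0, then annual maintenance plus
# operating costs (user and MIS hours at an assumed cost-of-service rate).
one_time_costs = 250_000        # licences, implementation support, training (assumed)
annual_maintenance = 40_000     # assumed
annual_operating_hours = 1_200  # user + MIS hours per year (assumed)
hourly_rate = 50                # assumed cost of service per hour

annual_running_cost = annual_maintenance + annual_operating_hours * hourly_rate
candidate_stream = [one_time_costs + annual_running_cost] + [annual_running_cost] * (SYSTEM_LIFE - 1)

# Status quo: maintenance and (higher) operating costs of the existing system (assumed).
existing_stream = [90_000 + 3_000 * hourly_rate] * SYSTEM_LIFE

pv_candidate = present_value(candidate_stream, DISCOUNT_RATE)
pv_existing = present_value(existing_stream, DISCOUNT_RATE)

# Common baseline: present value of the candidate's expenditure stream less the
# savings from scrapping and no longer supporting the existing system.
net_present_cost = pv_candidate - pv_existing

print(f"PV of candidate package   : {pv_candidate:12,.0f}")
print(f"PV of retaining status quo: {pv_existing:12,.0f}")
print(f"Net present cost (negative means a net saving): {net_present_cost:,.0f}")
```

Repeating the same calculation for each finalist produces comparable baseline figures for the Cost Comparisons section of the report.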
Presentation and signoff of the report
The finalised report should be formally approved by the client organisation (for example, using the System Recommendations Signoff letter). The project sponsor should accept the report and its findings. If the report is to be presented to the organisation as a request for formal executive approval, it may be prepared in the name of the project sponsor rather than the project team.
The Selection Report may need to be presented to the decision-making executive of the organisation. Materials from the report may need to be prepared in a suitable presentation format.
The presentation may take the form of a formal request for the Delivery segment of the project to be approved and financed. Note, however, that this request for approval may be more appropriately made after terms have been finalised with the vendors and a detailed “Delivery Approach Definition” or “Implementation Strategy” document has been prepared, in which the overall project is considered: the selection decision, together with wider aspects of the approach such as timescales, staffing and interactions with other systems and projects (see Process S700/S710).