Tuesday, April 12, 2016

Project Selection Process S190

S190 - Receive and Mark Responses


DEFINITION

Receive, evaluate and mark responses to the Invitation to Tender.

SUMMARY

Log and acknowledge tenders or proposals received from vendors in response to the Invitation to Tender (ITT).
Collate the information received for use in the evaluation process and in preparation for the production of the selection report.
Evaluate responses in terms of:
  • compliance with critical (mandatory/knockout) requirements, and
  • quantified extent to which responses meet the detailed questions and requirements.
Any omissions or queries may (at the discretion of the team) be referred back to the vendor for clarification.

PATH PLANNING GUIDANCE

This process is normal practice.

DEPENDENCIES

Prerequisites (Finish-Start):
  • Issue of Invitation to Tender + time for the vendors to reply
Prerequisites (Finish-Finish):
  • Agreement of detailed scoring scheme and actual weightings.
Dependent procedures (Finish-Finish):
  • Identify Issues (S200)
  • Confer with Vendors (S210)
  • Consult with Reference Users (S220)
  • Package Fit Tests (S180)
  • Agree final marks and issues (S230)
RECEIVABLES
  • Agreed scoring scheme
  • Prioritised Requirements Matrices
DELIVERABLES
  • Acknowledgements to vendors
  • List of Compliant Tenders
  • Scored Requirements Matrices

TOOLS

  • Examples: Requirements matrices, scoring schemes, spreadsheets etc as established in Process S060 - Define and agree selection scoring scheme.
    • Structured Scoring Scheme
    • Scoring Spreadsheet
    • Hardware/Software Comparison Worksheet
    • Vendor Comparison Worksheet
    • Cost Comparison Worksheet
    • System Quality Chart
    • Vendor Quality Chart
    • System Quality/Vendor Quality Chart
    • Cost Summary Chart
    • Application Comparison Worksheet
    • Hardware Comparison Worksheet

DETAILED DESCRIPTION OF TASKS

Receive and acknowledge responses

Some time after issuing the ITT it is a good idea to make sure that the vendors have received it and are planning to respond.  As the deadline approaches it may be useful to contact the vendors again and check that they will be able to meet it.  In some cases it may be found that all vendors are struggling to meet the timescale, and it might be appropriate to allow an extension.  Any changes in the timescale or other arrangements should be communicated to all vendors.
As the vendors submit their responses it is good practice to log their arrival and confirm receipt to the vendor.  Either a simple letter or a telephone call will normally suffice.  The team should keep a record of which proposals have been received and who holds each copy.  As the copies arrive they may be distributed to the personnel who will be reviewing them.
An initial check should be made on the format of the proposals to see that the basic rules have been followed and that it will be possible to assess them.  In a few cases, vendors will have failed to appreciate the need to comply with the required format and it is good practice to give them a second chance to comply before rejecting their submissions.

Review proposals for compliance with critical requirements

This review is based on the defined “criticality” of the requirements, normally established in Process S070.  The Requirements Matrices should hold codes showing whether each requirement or question is considered absolutely mandatory.  In principle, any proposal that fails a “knockout” mandatory requirement should be rejected without any further effort being spent on it.
Clearly, this is a serious decision to make.  Though the purpose of the evaluation process is to eliminate unqualified vendors, contriving reasons to eliminate alternatives is contrary to the objectives of the evaluation.  It is important to check the facts very carefully and, in particular, to confirm that:
  • the vendor really cannot meet the requirement,
  • there is no practical alternative way of meeting the underlying need, and
  • the requirement is genuinely so vital that a solution could not be considered without meeting it.
These facts should be confirmed with the vendor, with the project’s sponsor and with other relevant management in the client organisation.  If the facts are confirmed, the proposal should be rejected and no further effort should be wasted in assessing it.  Although the vendor should be informed of this decision, it may be wise to defer this action until all the proposals have been received and reviewed - there have been many cases where the ITT has asked the impossible and the needs of the client organisation have had to be reviewed.
If the proposal fails to meet any “business critical” requirements that are not of a “knockout” nature, these will be placed on the issues list (see Process S200).
Tenders which have been accepted as satisfying all the “knockout” requirements are considered “compliant”.  A “List of Compliant Tenders” should be issued.  It should be agreed with the client organisation that these compliant vendors should be taken through for full evaluation as described below.
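
By way of illustration, the following minimal sketch (in Python) shows how a List of Compliant Tenders might be derived mechanically once each requirement carries a criticality code.  The requirement identifiers, criticality codes and vendor responses shown are hypothetical, not part of the SIIPS toolkit.

# Minimal sketch: deriving the List of Compliant Tenders from knockout codes.
# The requirement identifiers, criticality codes and vendor responses below
# are hypothetical.

# Criticality codes from the Requirements Matrix; "K" marks a knockout
# (absolutely mandatory) requirement.
requirements = {
    "R001": "K",   # knockout
    "R002": "B",   # business critical, but not knockout
    "R003": "D",   # desirable
}

# Whether each vendor's response meets each requirement.
responses = {
    "Vendor A": {"R001": True,  "R002": True,  "R003": False},
    "Vendor B": {"R001": False, "R002": True,  "R003": True},
}

def compliant_tenders(requirements, responses):
    """Return the vendors that satisfy every knockout requirement."""
    knockouts = [r for r, code in requirements.items() if code == "K"]
    return [
        vendor for vendor, met in responses.items()
        if all(met.get(r, False) for r in knockouts)
    ]

print(compliant_tenders(requirements, responses))   # ['Vendor A']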

Collating the information

The information from the vendors should normally have been received in the requested format, possibly using “electronic” documents such that the collation and comparisons can be partially automated.  It may be appropriate to collate this information or reformat it to help in the evaluation process and for the preparation of the final report (see Process S250).
It is unusual to re-enter all the responses manually.  If the responses have been collected automatically in electronic documents it may, however, be practical to combine all answers into a single spreadsheet or table format so that full comparisons can be made on any detailed question or requirement.
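
As an illustration, the sketch below collates per-vendor answer files into one comparison table.  It assumes, purely for the purposes of the example, that each vendor has returned its answers as a simple two-column CSV file (question identifier, answer) named after the vendor; the file names and layout are hypothetical.

import csv
from pathlib import Path

# Minimal sketch: collating per-vendor answer files into a single comparison
# table.  Assumes (hypothetically) that each vendor returned its answers as a
# two-column CSV file named after the vendor: question_id, answer.
def collate_responses(folder="responses"):
    vendors = []
    combined = {}                      # question_id -> {vendor: answer}
    for path in sorted(Path(folder).glob("*.csv")):
        vendor = path.stem             # e.g. "Vendor A" from "Vendor A.csv"
        vendors.append(vendor)
        with path.open(newline="") as f:
            for row in csv.reader(f):
                if len(row) >= 2:      # skip blank or malformed lines
                    combined.setdefault(row[0], {})[vendor] = row[1]
    return vendors, combined

def write_comparison(vendors, combined, out="comparison.csv"):
    # One row per question, one column per vendor, blank where unanswered.
    with open(out, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["question_id"] + vendors)
        for question_id in sorted(combined):
            answers = combined[question_id]
            writer.writerow([question_id] + [answers.get(v, "") for v in vendors])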
It is usually more useful to create tables comparing key facts - for example, hardware requirements, cost comparisons and vendors’ capabilities.  Where specific tables were used in the Invitation to Tender, the data collected may easily be tabulated (see the example layouts in Examples: Request For Proposal and the example worksheets shown below).  This can be particularly useful for comparing the costings of the various proposals.
The collected data may be grouped into tables showing the details of each vendor in separate columns.  The following examples are included in the toolkit:
 
Application Comparison Worksheet
Collates information from each vendor about the proposed applications.  Holds basic summary information such as release date and number of installations.
Hardware Comparison Worksheet
Collates information from each vendor about the proposed hardware configuration and operating environment.  This example assumes the hardware is to be procured for the new system.  A similar format could be used to collect facts relating to the degree of expected usage of an existing hardware configuration.
Vendor Comparison Worksheet
Collates basic information from each vendor about their organisation, references, technical support capabilities and technology infrastructure etc.  It gives a feel for the reliability of the vendor as a future business partner.
Cost Comparison Worksheet
Collates responses from each vendor about the expected costs for each alternative over five years.
Cost Summary Chart
Summarises and charts the estimated cost responses from the Cost Comparison Worksheet.
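
By way of example, the sketch below shows the kind of five-year cost summary such a worksheet produces.  The cost categories and figures are entirely hypothetical, and annual support is assumed, for illustration only, to be the sole recurring cost.

# Minimal sketch: summarising estimated costs per vendor over five years, as
# on a Cost Comparison Worksheet.  The cost categories and figures are
# hypothetical, and annual support is assumed to be the only recurring cost.
YEARS = 5

costs = {
    "Vendor A": {"licences": 120_000, "hardware": 80_000,
                 "implementation": 60_000, "annual_support": 25_000},
    "Vendor B": {"licences": 90_000, "hardware": 100_000,
                 "implementation": 75_000, "annual_support": 30_000},
}

for vendor, c in costs.items():
    one_off = c["licences"] + c["hardware"] + c["implementation"]
    total = one_off + YEARS * c["annual_support"]
    print(f"{vendor}: five-year total = {total:,}")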

Mark the proposals for compliance

The proposals should be assessed according to the agreed scoring scheme (see Process S060).  This is normally a manual process where the degree of compliance is assessed by relevant project team members according to the agreed scheme.

Some guidelines:

  • Never give different proposals to different people to mark - differences in the standards applied by each assessor would distort the scores of specific vendors.
  • Instead, divide the review by topic, giving each section of the proposals to an appropriate member of the project team to assess across all vendors.
  • To encourage “buy-in,” it is recommended that client users, MIS and management actively participate in the evaluation process, especially those who will be primarily involved with ongoing system use and maintenance.
  • To encourage continuity through the system implementation process, the client personnel who will be actively involved in implementing the system should also take part in the selection process.  It is important that the system’s owner and the project sponsor are involved as well.
  • Let everyone read the proposals first without attempting to mark them.
  • Hold an initial workshop for the evaluation team to:
    • Collect and consider any initial problems or key issues (this could be combined with the identification of failed “knockout” requirements as described above).
    • Make sure all assessors understand the process and, in particular, the use of the scoring scheme and spreadsheets.
  • Evaluate all proposals in detail, collecting compliance scores and logging any key issues and zero scores (see Process S200).
  • Hold an evaluation workshop to discuss any problem areas in the approach to marking or specific problems with the vendors’ responses.  Final adjustments to the scores can be made during this workshop unless a significant amount of work is involved.
  • Collect together the compliance scores and combine them into an overall comparison spreadsheet showing detailed and summary comparisons for all the vendors against the detailed questions and against the defined levels of summarisation.
A number of scoring and evaluation tools are available in the toolkit, for example:
System Requirements Worksheet
Contains detailed requirements from the Requirements Matrices plus columns for scores, modification costs, cross-references and textual comments for explanations.  Can be used in one of two ways:
  1. As a self-assessment response form to be included in the ITT to collect responses from each vendor about how well a proposed system meets the client organisation’s specific requirements.
  2. For the evaluation team to collect and collate assessments, scores, modification costs and comments relating to the responses.
System Scoring Worksheet
Collects scores for each vendor’s response showing how well the proposed system meets the client’s specific requirements.
Scoring spreadsheet
Example spreadsheet for calculating weighted scores from compliance scores and weights, for producing summary charts and for producing layers of sub-totalling.
SQ/VQ Charts
Various example summary and detailed charts of the system quality and vendor quality findings.
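
To make the arithmetic behind the Scoring Spreadsheet concrete, the sketch below calculates weighted scores (weight multiplied by compliance score) and sub-totals them by section.  The sections, weights and marks shown are hypothetical; a real matrix would carry many more requirements and several levels of summarisation.

from collections import defaultdict

# Minimal sketch of the calculation behind the Scoring Spreadsheet: weighted
# score = weight x compliance score, sub-totalled by section.  The sections,
# weights and marks are hypothetical.
requirements = [            # (section, requirement, weight)
    ("Ledger",    "R001", 5),
    ("Ledger",    "R002", 3),
    ("Reporting", "R003", 4),
]

scores = {                  # compliance scores, e.g. 0 (not met) to 3 (fully met)
    "Vendor A": {"R001": 3, "R002": 2, "R003": 1},
    "Vendor B": {"R001": 2, "R002": 3, "R003": 3},
}

for vendor, marks in scores.items():
    subtotals = defaultdict(int)
    for section, requirement, weight in requirements:
        subtotals[section] += weight * marks[requirement]
    total = sum(subtotals.values())
    print(vendor, dict(subtotals), "total:", total)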

Dealing with proposed modifications

Vendors may volunteer solutions that include proposed modifications to their standard products or custom-programmed add-ins.  Such proposals should not be rejected out of hand, but should be assessed in terms of:
  • Practicality - is it a practical option?
  • Elapsed time - how long will it take, is there enough time, will it affect the delivery date?
  • Staffing requirements - are there sufficient human resources to cope with the customisations and/or modifications?
  • Risk - will the reliance on customisation unduly increase the project’s risks, eg risk of errors, risk of delays, risk of additional costs?
  • Costs - what is the effect on the project’s costing and the overall cost/benefit analysis for the new system?
  • Future support - will the changes or customised components make it difficult or expensive to maintain the system and, in particular, to upgrade to future releases of the packaged components?
Generally speaking, a solution relying on modifications is less satisfactory than one where the requirements can be met within the standard facilities of the packages.  Due consideration should be given to these factors in the assessment.  The scoring will reflect a lower level of compliance, but this may not result in a significant numerical difference.  Required modifications should therefore be noted in the issues list (see Process S200), their full effect should be included in the cost comparisons, and they should be discussed as specific issues during the final evaluations.

Dealing with alternatives

Sometimes a vendor will offer alternative solutions or options for some of the requirements.  If possible, the preferred approach should be decided early and the full comparison made on that basis.  If necessary, the options may be assessed separately, but this will often mean treating each permutation of solutions as an entirely separate proposal.

Make sure that options are not mixed up in the assessment.  For example, if the functionality was assessed on the basis of a specific option being included, then the cost comparison should also be made on that basis.
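
Where options must be assessed separately, one simple way to keep the assessment consistent is to expand each offered option into its own candidate entry, so that functionality scores and costs are always recorded against the same permutation.  A minimal sketch, with hypothetical vendors and options:

# Minimal sketch: expanding offered options into separate candidate proposals
# so that functionality scores and costs are always recorded against the same
# permutation.  The vendors and options are hypothetical.
offers = {
    "Vendor A": ["standard"],
    "Vendor B": ["standard", "standard plus payroll module"],
}

candidates = [
    f"{vendor} ({option})"
    for vendor, options in offers.items()
    for option in options
]
print(candidates)
# ['Vendor A (standard)', 'Vendor B (standard)',
#  'Vendor B (standard plus payroll module)']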
