Sunday, April 10, 2016

Project Selection Process S100

S100 - Automatic Weighting

[Image: SIIPS Selection Processes (S)]

DEFINITION

[Image: S100 - Automatic weighting]

Importance weightings are calculated automatically, based directly on the agreed “criticality” categorisation of each question.

SUMMARY

The detailed requirements stated in the Requirements Matrix are automatically weighted based upon the criticality classifications established earlier (see Process S070).  Simple rules are used to translate the criticality rating into a numeric weight.  The project sponsor and any other key decision makers should be consulted to agree the relative importance of the groups of questions; these group weightings will be used in calculating the summary levels and the overall comparison during the analysis of the proposals received.
The deliverable should be a fully prioritised Requirement Matrix showing the relative weights of all detailed requirements and other questions.  This will subsequently be used in the evaluation, marking and comparison of vendors’ proposals.

PATH PLANNING GUIDANCE

This process is normal practice in short form selection.

DEPENDENCIES

Prerequisites (Finish-Finish):
  • Requirement Matrix
Dependent procedures (Finish-Start):
  • Receive and mark responses (S190)

RECEIVABLES

  • Requirement Matrix

DELIVERABLES

  • Fully prioritised Requirement Matrix

TOOLS

  • Examples: Structured Scoring Scheme
  • Examples: Scoring Worksheet
  • Examples: System Scoring Worksheet
  • Examples: Request For Proposal (includes questionnaires in appendices)
  • Requirements - library of example requirements matrices
  • Examples: System Requirements Worksheet
  • Examples: Data for System Quality/Vendor Quality Charts

DETAILED DESCRIPTION OF TASKS

Choice of approach

This process is intended to reduce the time and effort taken in agreeing weightings for the evaluation of vendors’ proposals.  A simple numerical system is used to translate the criticality categories into importance weightings.  These criticality categories will have been determined and agreed with the client organisation (see Process S070).
This is in contrast to the “full” approach where each weighting is evaluated and agreed with the relevant user representatives on its own merits (see Process S090).  Clearly, this automatic approach will not perfectly reflect the relative importance of each requirement.  The advantages and disadvantages may be summarised as follows:
Advantages of automatic weighting:
  • Saves time and effort.
  • Avoids the need to review and agree several hundred weighting factors with a large body of user representatives.

Advantages of “full” weighting by importance:
  • Allows the users to differentiate between different levels of importance.
  • Avoids overemphasis of insignificant but mandatory features (eg availability of an audit trail) and allows full weight to be given to the key differentiating factors even where they are not mandatory.
  • Encourages participation by the key users.
  • Leads to greater “ownership” of the results.

Automatic weighting of individual questions

The chosen criticality categories should be translated into a simple scoring system, giving the highest weight to “knockout” mandatory requirements, a high weight to business critical requirements and a low weight to other requirements, for example:
Category                 Weight
“Knockout” mandatory       5
Business critical          4
Desirable                  1
The translation may be made manually by entering weightings based on these criteria, but it is normally more efficient to convert the data automatically within the spreadsheet.
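
As a minimal sketch only (not part of the SIIPS toolset), the same translation could also be scripted outside the spreadsheet.  The category names and weights below mirror the example table above; the requirement records are purely illustrative:

# Minimal sketch: translate criticality categories into importance weights.
# The categories and weights mirror the example table above; the requirement
# records are hypothetical.
CATEGORY_WEIGHTS = {
    "knockout mandatory": 5,
    "business critical": 4,
    "desirable": 1,
}

def weight_for(category: str) -> int:
    """Return the importance weight for a criticality category."""
    try:
        return CATEGORY_WEIGHTS[category.strip().lower()]
    except KeyError:
        raise ValueError(f"Unknown criticality category: {category!r}")

# Illustrative Requirements Matrix rows.
requirements = [
    {"id": "R001", "text": "Audit trail on all postings",   "category": "Knockout mandatory"},
    {"id": "R002", "text": "Multi-currency ledgers",        "category": "Business critical"},
    {"id": "R003", "text": "User-definable report layouts", "category": "Desirable"},
]

for req in requirements:
    req["weight"] = weight_for(req["category"])
    print(req["id"], req["category"], "->", req["weight"])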

Summary Levels

It will still be necessary to consider and review the relative weights given to groups of questions.  This is vital to avoid the greatest weight being given simply to the subjects which have the greatest number of questions.  A top-down approach is recommended: the full set of requirements is divided into subjects, each representing a proportion of the overall requirement, and each subject can, in turn, be broken down into lower levels if appropriate.
A top-down scheme is illustrated below.  The components of each level of summarisation are weighted out of 100%, starting with the top level:
[Image: SIIPS structured top-down/bottom-up scoring scheme]
Each of the lower levels is itself then broken down into components weighted out of a total of 100%:
[Image: SIIPS structured top-down/bottom-up scoring scheme - lower levels]
The weights would normally be discussed and agreed with the project sponsor and other key decision makers within the client organisation.  The results may be typed directly into the scoring spreadsheet version of the Requirements Matrices (provided it has been set up to handle the summarisation automatically - see, for example, “Examples: Scoring Worksheet”).  Once agreed, the weights should not be revised unless a compelling reason is found.  It should not be routine practice to revise the importance weightings based on the proposals received as this encourages the manipulation of the questions to get the “right” answer.
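
As an illustration only (the subjects, percentage shares and marks below are assumed, not taken from the SIIPS worksheets), the following sketch shows how subject-level percentage weights might be combined with the automatic question weights when marking a single proposal:

# Sketch: combine subject-level percentage weights with the automatic
# question weights to produce an overall score for one proposal.
# The subjects, shares and marks are illustrative assumptions.
subjects = {
    # subject: (share of overall requirement, [(question weight, mark out of 5), ...])
    "General ledger": (0.40, [(5, 4), (4, 3), (1, 5)]),
    "Reporting":      (0.35, [(4, 5), (1, 2)]),
    "Technical":      (0.25, [(5, 5), (4, 4)]),
}

def subject_score(questions):
    """Weighted average mark for one subject, normalised to the range 0..1."""
    total_weight = sum(weight for weight, _ in questions)
    return sum(weight * mark for weight, mark in questions) / (total_weight * 5)

overall = 0.0
for name, (share, questions) in subjects.items():
    score = subject_score(questions)
    overall += share * score
    print(f"{name}: {score:.0%} of available marks (share of overall requirement {share:.0%})")

print(f"Overall weighted score: {overall:.0%}")

In practice this summarisation would be handled within the scoring spreadsheet itself, as noted above; the sketch simply makes the arithmetic explicit.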

Note that it is not normal practice to publish these weights: publication does not increase the value of the process, but it can lead to distortion of the proposals and unnecessary argument.
