These guidelines are to be used when thinking about how to perform ongoing monitoring and adaptation. They concern only the management of uncertainty/risk, so they should be considered alongside any other guidelines or suggestions based on other factors.
The guidelines are designed to apply to any situation where monitoring and adaptation are done at a level where an agreed approach is considered worthwhile. They are worded very carefully to be as widely applicable as possible, which means they are not always as short as they would be if written for just one organization. Consequently, you may need to read each guideline more than once, carefully, along with any related examples, explanations, and even other documents referenced from this page. A lot of knowledge about how to manage risk has been condensed onto this page.
Usually it will be easy to think of ways to follow the guidelines, but sometimes, if your situation is demanding, doing so may require quite a lot of thought and experimentation. What should not be difficult is justifying the guidelines. As far as possible, these are guidelines that most people think are obvious good sense.
Ongoing monitoring and adaptation are common in organizations at strategic, operational, project portfolio, programme, and project levels.
The approach to monitoring and adaptation should meet the following practice guidelines (in bold):
Monitoring takes place sufficiently frequently and also in response to unexpected events of such significance that monitoring is worthwhile.
Monitoring is informed by information about events and outcomes that are unwanted and unplanned (e.g. fraud, accidental injury, errors) as well as events and outcomes that are wanted and planned for (e.g. sales, profits, cost reductions).
EXAMPLE: A council department responsible for housing repairs had been using performance reports for monitoring but discovered that their information was not giving a true picture. It only showed speed of defined tasks according to strict rules designed to remove the effect of circumstances outside the direct control of the council. When they extended their data gathering to report on all delays and problems not solved at the first attempt they were shocked by the results but quickly able to improve real performance dramatically.
Monitoring is informed by rich information about the organization and its environment, gathered to support monitoring, focused on outcomes of legitimate interest to stakeholders and the factors thought to influence those outcomes, and helpfully presented.
Note 1: A small set of "key performance indicators" is not rich information. If "key performance indicators" are used then they need to be supported by information that provides a greater understanding of those indicators.
Note 2: Helpful presentation often involves time series graphs, ratios, and moving annual totals or averages.
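As a simple illustration of one of the presentation devices mentioned above, a moving annual total (MAT) is the sum of the latest 12 months of a figure, recomputed each month so that seasonal patterns cancel out. The sketch below uses hypothetical monthly sales figures, not data from the example above:

```python
# Hypothetical monthly sales figures (15 months of data).
monthly_sales = [110, 95, 120, 130, 105, 98, 140, 125, 115, 108, 132, 119,
                 121, 101, 126]

def moving_annual_totals(values, window=12):
    """Return the total of the most recent `window` observations,
    recomputed at each new observation once enough data exists."""
    return [sum(values[i - window:i]) for i in range(window, len(values) + 1)]

mats = moving_annual_totals(monthly_sales)
```

With 15 months of data this yields 4 successive annual totals; plotted as a time series they smooth out within-year seasonality while still revealing trend changes.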
The uncertainty around the accuracy of that information and its interpretation is communicated explicitly and considered.
Note 1: This may be done in a wide variety of ways and does not need to be quantified, though this is helpful where done well.
Note 2: The explanations of uncertainty are more likely to be considered if they are prominent and near to the information to which they relate.
Potential future outcomes are forecast and related to the interests of stakeholders frequently enough that the forecasts remain informative.
Note 1: Forecasts should show the potential events and outcomes rather than just best estimates or averages.
Note 2: The interests of stakeholders may be known from earlier work to express preference explicitly (e.g. as objectives, targets, budgets, value functions).
Note 3: Rolling forecasts are one good approach.
Note 4: Not all forecasts need to be quantitative.
Uncertainty around those forecasts is expressed and considered.
Note 1: This is achieved in part simply by showing alternative possible futures, e.g. with ranges.
Opportunities for (partially) automated forecasting are considered.
Note 1: One of the easiest and simplest approaches is statistical extrapolation, which is often more accurate (and less biased) than more laborious methods using human judgement. It may be worth checking that human judgement is more informative before making the extra effort to use judgement in forecasting routinely.
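One minimal form of statistical extrapolation is to fit a straight-line trend to past figures and project it forward, showing a range around each projection rather than a single number. The sketch below is only an illustration under assumed data, not a recommended forecasting method; the crude range here is simply the largest past deviation from the fitted trend:

```python
def linear_extrapolation(history, periods_ahead):
    """Fit a least-squares straight line to `history` and project it
    forward, returning (low, central, high) for each future period."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    fitted = [intercept + slope * x for x in xs]
    # Crude uncertainty range: the worst past miss of the fitted line.
    spread = max(abs(y - f) for y, f in zip(history, fitted))
    forecasts = []
    for k in range(1, periods_ahead + 1):
        central = intercept + slope * (n - 1 + k)
        forecasts.append((central - spread, central, central + spread))
    return forecasts
```

Comparing the accuracy of this kind of extrapolation against past human-judged forecasts is one practical way to carry out the check suggested in the note above.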
Actual events and progress are compared to previous expectations.
Note 1: The difference between expectations and reality helps to draw attention to what may need to change.
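A comparison of actuals against expectations can be as simple as computing the differences and ranking them so the largest divergences get attention first. The figures and category names below are hypothetical:

```python
# Hypothetical expectations set earlier, and the actual results.
expected = {"sales": 500, "cost": 320, "complaints": 12}
actual = {"sales": 470, "cost": 335, "complaints": 25}

# Variance for each item: positive means reality exceeded expectation.
variances = {k: actual[k] - expected[k] for k in expected}

# Rank items by how far reality diverged from expectation, largest first,
# to draw attention to what may need to change.
flagged = sorted(variances, key=lambda k: abs(variances[k]), reverse=True)
```

Here "sales" would be flagged first (30 below expectation), prompting a closer look before smaller variances are examined.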
Adaptation to changing circumstances and new ideas is thorough, with the understanding of interests, plans, and designs all reviewed and available to change.
EXAMPLE: A large computer project has been proceeding for several months and a number of design changes have been made and passed through a change control process. One technical change is particularly important and controversial, with heated arguments between engineers at the change control meeting. The chairperson for the change control meeting sees the controversy and pushes participants to be honest and open about all the potential implications of making the change and not making it, and about the important uncertainties that remain. The case for change is compelling and the change is agreed, but the chairperson still insists on urgent work to explore the implications of the change further and to search for related technical changes that may follow. The reason for this is that, in the struggle to get the initial change agreed, there may have been a tendency to down-play some of the implications of the change.
Changes to goals, objectives, budgets, limits, etc., and to plans and designs are communicated clearly and promptly.
Effort is made to check that changes have been correctly understood by those to whom they have been communicated.