Guidelines for managing uncertainty/risk within planning, design, and decision-making activities
Introduction to the guidelines
These guidelines are to be used when thinking about how to perform planning, design, and decision-making. They concern only the management of uncertainty/risk, so they should be considered alongside any other guidelines or suggestions based on other factors.
The guidelines are designed to apply to any situation where planning, design, or decision-making is done at a level where an agreed approach is considered worthwhile. They are worded very carefully to be as widely applicable as possible, which means they are not always as short as they would be if you were writing just for your own organization. Consequently, you may need to read each guideline more than once, carefully, and read any related examples, explanations, and even other documents referenced from this page. A lot of knowledge about how to manage risk has been condensed onto this page.
Usually it will be easy to think of ways to follow the guidelines, but in demanding situations it may take quite a lot of thought and experimentation. What should not be difficult is justifying the guidelines: as far as possible, these are guidelines that most people think are obvious good sense.
The scope of planning, design, and decision-making activities
Design is a common management activity and includes design work for customers and design of the organization itself. It includes design of premises and work sites, organizational structures, jobs, systems, processes, machines, tools, supply chains, client proposals, products, and services.
Planning is done in various forms in organizations, including annual and longer term strategy making, annual business planning, and project planning.
Design and planning are often combined because designs need to take into consideration the process of realising them, just as plans reflect designs.
Within design and planning, many decisions are taken, but many decisions also take place separately.
The approach to planning/design/decision-making should meet the following practice guidelines:
Effort is made to gather and understand information about the context of the design/plan/decision (including the legitimate, relevant interests of stakeholders) and to use this information to infer properties of good design/plan ideas.
Note 1: Gathering information is an obvious and fundamental step towards managing uncertainty and risk.
Note 2: It is less obvious but still true that design/planning involve what is effectively a search through a vast set of possible designs/plans and there is important uncertainty as to where the good ones are. Using information about the context is a very important way of guiding the search.
Effort is made to assess the significance of what is not known with certainty about the context of the design/plan/decision (which may be facilitated by attempts to predict the performance of designs/plans with uncertainty represented explicitly).
Note 1: Assessing the significance of what is not known with certainty is useful for guiding the search for more information and for directing attention towards designing/planning for a range of possibilities.
Note 2: The most common and best known way to represent uncertainty explicitly within a model is using probability numbers, but this is not the only way to do it.
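As an illustration of representing uncertainty with probability numbers inside a model, the short Python sketch below predicts the performance of two candidate plans by Monte Carlo simulation. Every figure in it (the demand distribution, the margin and capacity costs, the capacities themselves) is an invented assumption, not something taken from these guidelines:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

def simulate_profit(capacity, n_trials=10_000):
    """Predict mean profit for a plan, with demand uncertainty
    represented explicitly as a probability distribution."""
    total = 0.0
    for _ in range(n_trials):
        # Uncertain demand: a triangular distribution (illustrative figures).
        demand = random.triangular(low=60, high=140, mode=100)
        units_sold = min(demand, capacity)
        # Invented economics: margin of 5 per unit sold, capacity cost of 2.
        total += units_sold * 5 - capacity * 2
    return total / n_trials

# Compare two candidate plans under the same explicit uncertainty.
small_plan = simulate_profit(capacity=90)
large_plan = simulate_profit(capacity=130)
```

The same loop could just as easily report a spread of outcomes (e.g. the chance of a loss) rather than only the mean, which is often more useful for the assessments the guidelines describe.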
Information gathering is guided by assessments of the significance of what is not yet known with certainty (which may be facilitated by attempts to predict the performance of designs/plans with uncertainty represented explicitly).
Note 1: This refers to all information gathering relevant to design/planning, not just information about the context.
Note 2: Assessments do not have to be quantitative, even if facilitated by a model that represents uncertainty explicitly.
Efficient methods are used to guide the search for good design/plan ideas so that initial ideas are good and subsequent revisions are usually better still.
EXAMPLE: A government agency responsible for research and development into defence systems must make plans for a wide variety of projects that will unfold over time and produce results that may be hard to predict with certainty. If there are 10 projects that could start now then there are 2^10 = 1024 different selections that could be made (before considering budget constraints and so on). In practice there will be more projects and they will not all be ready to start now. In addition, some ideas for projects will arise in future in response to discoveries not yet made. The set of potential plans that might be considered is huge and it is impossible to evaluate every plan. Therefore, it is necessary to use rules of thumb that guide the search for promising combinations of projects, bringing some of the more promising combinations to consideration early on.
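To make the arithmetic concrete, here is a small Python sketch (the project names, costs, and values are all invented for illustration). It confirms that 10 candidate projects give 2^10 = 1024 possible selections, and shows one simple rule of thumb, value for money first, that can surface promising combinations without evaluating every plan:

```python
from itertools import combinations

# Ten hypothetical projects: name -> (cost, expected value). Figures invented.
projects = {
    "P0": (4, 9), "P1": (3, 7), "P2": (5, 8), "P3": (2, 5), "P4": (6, 10),
    "P5": (1, 2), "P6": (4, 6), "P7": (3, 4), "P8": (5, 9), "P9": (2, 3),
}

def all_selections(names):
    """Yield every possible selection of projects, including choosing none."""
    for r in range(len(names) + 1):
        yield from combinations(names, r)

def greedy_selection(projects, budget):
    """Rule of thumb: take projects in descending value-per-cost order,
    skipping any that would exceed the budget."""
    ranked = sorted(projects, key=lambda p: projects[p][1] / projects[p][0],
                    reverse=True)
    chosen, spent = [], 0
    for name in ranked:
        cost = projects[name][0]
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent

n_plans = sum(1 for _ in all_selections(list(projects)))  # 1024 selections
shortlist, spent = greedy_selection(projects, budget=15)
```

A heuristic like this does not guarantee the best combination (finding it exactly is a knapsack problem), but it illustrates how rules of thumb can bring some of the more promising plans to consideration early on.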
A plan that creates opportunities to learn and adapt is developed and evaluated.
Note 1: This does not have to be the plan that is actually selected and put into action, but a plan that creates these opportunities should be developed (to some extent) and evaluated. Ideally, evaluation should place some value on these opportunities. The purpose of this guideline is to counter a common bias towards thinking the future is more predictable and controllable than it is, and so to under-value plans that include opportunities to learn and adapt.
EXAMPLE: A company that provides online research services wished to improve the performance of the software that supported the services. It defined its objectives in terms of non-functional performance measures, identifying the current level of performance on each one and the ideal and ‘failing’ levels of performance too. It then checked each design idea against these and the projected costs, and then delivered changes every two weeks, doing the most worthwhile and least costly changes first. Each time new software was delivered its performance against the objectives was measured and monitored. The plans were then modified in light of the actual improvement achieved and customer reactions.
EXAMPLE: A government department aimed to increase greatly the proportion of citizens receiving social security payments direct to their bank accounts, rather than by cheque or cash. What was not known initially was which of the many techniques for influencing the citizens would be most effective, and how much would need to be done to reach the high level of compliance sought. Careful trials were designed, with control groups in some instances. This was combined with focus groups and other research.
Design proceeds in such a way that opportunities to learn by testing ideas (e.g. by simulation, prototyping, focus groups) are created and used.
EXAMPLE: In a series of highly competitive sailing boat races, one team gained an advantage by accelerating the cycle of testing. They simulated alternative designs on computer, then fitted a rapidly manufactured shape to one boat for testing on the water against a second test boat. The crew gave their feedback too, which was easier because the simulation workstations were near the dock. By the first race thousands of ideas had been simulated and some 50 physical changes had been tested on the water. In total, this took 2 minutes off their expected course time.
Opportunities to learn are created with a focus on those uncertainties most important to the design/plan.
EXAMPLE: The Environmental Protection Agency’s TRIAD approach to managing pollution clean-up projects involves several powerful elements for dealing with uncertainty. One is to develop a model of the site early on and use it to direct data gathering through the testing of samples, both before and during cleaning. In a typical project the pollution is of different types, usually invisible to the naked eye, and often moving around the site due to water movement and wind. Some areas of the site, or near it, are more important, such as a nearby river or school. Testing is directed towards the most important uncertainties around the model, and the model is updated and improved frequently throughout the project.
The potential performance and other consequences of alternative designs/plans under consideration are explored objectively and thoroughly, quantified where appropriate, and expressed with suitable communication of uncertainty.
EXAMPLE: A police force started an ambitious project to create a single query system that would access several of its databases and help officers investigate crime. After some time the project failed because what was attempted was not feasible and a fundamental user requirement had not been understood. Nevertheless, the organization did not give up and instead asked for a new business case for a second attempt at the project. This new business case was developed in an entirely different way, with careful explanations of the significant uncertainties around the business case and a recommended approach that was more focused, better researched, and involved testing key ideas early on.
Note 1: If a plan/design is modified then that creates what is effectively a new plan/design and so the potential consequences of this should be examined. This may help to counter a common fault of making changes to a plan/design prompted by one urgent consideration without thinking through all the other consequences of that change.
Opportunities to evaluate designs/plans more efficiently, by automating the exploration of at least some possible consequences, are considered.
EXAMPLE: For many years now, accountants have been managing cash with the help of cash flow forecasting spreadsheets. More advanced designs can be used to explore the implications of different timings of larger receipts and payments, and so quantify the chances of breaching limits on borrowing. Doing any of this without software would be very laborious.
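A minimal sketch of that kind of calculation, in Python rather than a spreadsheet, might look like the following. All figures are invented, and a real forecast would model many receipts and payments rather than a single large receipt with uncertain timing:

```python
import random

random.seed(0)  # fixed seed so the estimate is repeatable

def breach_probability(opening_cash, weekly_net, big_receipt, limit,
                       weeks=12, n_trials=5_000):
    """Estimate the chance that the cash balance breaches a borrowing
    limit when the arrival week of one large receipt is uncertain."""
    breaches = 0
    for _ in range(n_trials):
        arrival = random.randint(4, 10)  # uncertain timing of the receipt
        cash = opening_cash
        for week in range(1, weeks + 1):
            cash += weekly_net
            if week == arrival:
                cash += big_receipt
            if cash < limit:  # e.g. an overdraft limit
                breaches += 1
                break
    return breaches / n_trials

p_breach = breach_probability(opening_cash=50, weekly_net=-20,
                              big_receipt=200, limit=-100)
```

With these invented figures the balance breaches the limit only when the receipt arrives late, so the estimated probability settles near 2/7. As the example says, exploring the different timings without software would be very laborious; the loop does it thousands of times.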
The potential consequences considered include direct, intended consequences; indirect and unintended consequences; and the consequences should unexpected events occur that are not directly influenced by the organization’s actions or the design’s performance (e.g. bad weather, economic crisis, cheating, user error).
Note 1: This is perhaps the most traditional of all risk management guidelines.
Each attempt to make or revise a design/plan is iterative, typically with detail and improvements added in successive stages, but with the possibility of backtracking too, until a conclusion is reached.
EXAMPLE: Whether deliberate or not, business planning with a forecasting model tends to be iterative. Two elements are likely to be modified as iterations continue. One is the plan itself, which will tend to improve as the details of each forecast are reviewed and inspire changes to the plan. The other is the forecasting model. Just looking at forecasts from a model often reveals implausible predictions and this leads to corrections or expansions of the model, often to improve it where uncertainty is important.
The selection of a design/plan fairly reflects all the outcomes considered possible and all relevant, legitimate interests.
Note 1: This is another fundamental guideline. It is wrong to make a narrow assumption about what will happen and select a plan/design on that basis alone, even if the assumption seems the most likely outcome.
Where possible, designs/plans are reviewed repeatedly over time and adapted or replaced.
The design/plan chosen is clearly documented and communicated, and effort is made to test that the statement of the design/plan is clear and that those who receive the communication have understood it correctly.