Working In Uncertainty (WIU)
Relevant authoritative guidance
The most authoritative, technically competent, and influential guidance and regulations on how to deal with uncertainty (and so manage risk) are not to be found by searching the internet for 'risk management'. This is because that authoritative guidance is within other documents covering such things as decision-making and forecasting.
Search the internet for material on 'risk management' and you will get links to ISO 31000, COSO's ERM framework, and various others, all apparently describing a separate management system involving listing risks. (In fact some are trying to describe an approach integrated into management.) This material tends to be relatively short and is almost always nothing more than guidance.
On this page I list some of the most important documents in various areas, and describe their contents and approach. Here are some general observations about the documents listed:
Not generic and yet generic: The documents have almost always been written for a particular area of application (e.g. health, banking, space mission safety). However, a surprisingly large amount of the thinking within them can easily be applied more widely.
Importance: Most of the documents address uncertainty/risk in areas where the stakes are high and the challenges are complex, e.g. nuclear meltdown, the stability of the world banking system, the safety of astronauts, the health of millions of people and animals.
Status: Many of the documents are part of a regulatory system, often providing guidance on how to comply with requirements, or providing the requirements themselves.
Rigour: Because of the challenges and high stakes, the level of rigour promoted, and the skill and work needed, tend to be much higher than is usually required in everyday life and work. This can create the entirely false impression that dealing with uncertainty well must always involve a huge effort by highly skilled people. Remember that the 431 page guide to quantitative risk analysis produced by NASA is, literally, written for rocket scientists.
Completeness: My list is certainly not complete. If you have any suggestions for documents to add please get in touch.
Climate
Predictions about the world's climate are highly uncertain, and they feed into one of the most important and complicated decision-making activities on the planet today. Most work in this area is done by scientists, so it is no surprise that the methods recommended are those of science.
Guidance Notes for Lead Authors of the IPCC Fifth Assessment Report on Consistent Treatment of Uncertainties
Issued by the Intergovernmental Panel on Climate Change (IPCC) in 2010 to guide authors contributing to its fifth report on future climate change predictions.
A total of 4 pages, all dedicated to uncertainty.
The approach communicates uncertainty through information about probability distributions and information about the evidence used. Authors are encouraged to provide information on the tails of distributions in particular, since these are important to decisions and yet likely to be particularly uncertain. A table for evaluating evidence breaks it down into the quantity of evidence and the extent of agreement in that evidence. There is also a table defining probability phrases with precise numerical ranges.
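For illustration, the likelihood scale in that note maps phrases to probability ranges roughly as sketched below (the ranges are quoted from memory, so check the note itself before relying on them). A tiny sketch of how such calibrated language might be applied:

```python
# Sketch: mapping IPCC-style likelihood phrases to probability ranges.
# Ranges follow my reading of the AR5 guidance note and should be verified
# against the note itself; they deliberately overlap.

LIKELIHOOD_SCALE = [
    ("virtually certain",      0.99, 1.00),
    ("very likely",            0.90, 1.00),
    ("likely",                 0.66, 1.00),
    ("about as likely as not", 0.33, 0.66),
    ("unlikely",               0.00, 0.33),
    ("very unlikely",          0.00, 0.10),
    ("exceptionally unlikely", 0.00, 0.01),
]

def phrases_for(probability: float) -> list[str]:
    """Return every phrase whose range contains the given probability."""
    return [name for name, lo, hi in LIKELIHOOD_SCALE if lo <= probability <= hi]

if __name__ == "__main__":
    print(phrases_for(0.95))  # -> ['very likely', 'likely'] (ranges overlap by design)
```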
Good Practice Guidance and Uncertainty Management in National Greenhouse Gas Inventories
Issued by the Intergovernmental Panel on Climate Change (IPCC) and then updated in 2001.
Seven main chapters and three relevant appendices, totalling 493 pages dedicated to uncertainty.
Most of the material relates to uncertainties around particular variables used in climate modelling. However, there are also excellent chapters on the theory involved, a chapter about practical procedures, and a useful glossary of terms. A lot of the material is on quantifying measurement uncertainty and then aggregating the effects using some method of propagating uncertainty through a model.
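Much of that quantification amounts to attaching an uncertainty to each input (e.g. activity data and emission factors) and then combining them. A minimal sketch of the two broad kinds of propagation discussed in guidance of this sort, a simple combination-in-quadrature formula and Monte Carlo simulation, using invented numbers:

```python
# Sketch: propagating uncertainty through emissions = activity * emission_factor.
# All numbers are invented; the two approaches mirror the general ideas, not
# the guidance's exact recipes.
import random
import statistics

activity = 1000.0    # e.g. fuel burned (TJ) - invented value
factor = 0.06        # e.g. tonnes emitted per TJ - invented value
u_activity = 0.05    # 5% relative standard uncertainty (assumed)
u_factor = 0.20      # 20% relative standard uncertainty (assumed)

# Approach 1: for a product of independent quantities, relative uncertainties
# combine in quadrature (a good approximation when uncertainties are modest).
u_combined = (u_activity**2 + u_factor**2) ** 0.5
print(f"Analytic: emissions = {activity * factor:.1f} +/- {u_combined:.1%} (relative)")

# Approach 2: Monte Carlo - sample the inputs, push each sample through the model.
samples = []
for _ in range(100_000):
    a = random.gauss(activity, u_activity * activity)
    f = random.gauss(factor, u_factor * factor)
    samples.append(a * f)
mean = statistics.mean(samples)
sd = statistics.stdev(samples)
print(f"Monte Carlo: mean = {mean:.1f}, relative sd = {sd / mean:.1%}")
```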
Best Practice Approaches for Characterizing, Communicating, and Incorporating Scientific Uncertainty in Climate Decision Making
Issued by the U.S. Climate Change Science Program as Synthesis and Assessment Product 5.2 in 2009. Many authors, but the lead author was M Granger Morgan, Department of Engineering and Public Policy, Carnegie Mellon University.
All 94 pages are devoted to uncertainty.
Downloading this guidance is like getting a fantastic book by leading thinkers for free. It covers all the main aspects of uncertainty in decision making and is easy to read as well as being authoritative. Chapters cover: Sources and Types of Uncertainty, The Importance of Quantifying Uncertainty, Cognitive Challenges in Estimating Uncertainty, Statistical Methods and Models, Methods for Estimating Uncertainty, Propagation and Analysis of Uncertainty, Making Decisions in the Face of Uncertainty, Communicating Uncertainty, and Some Simple Guidance for Researchers.
Guidelines for the Preparation of Dispersion Modelling Assessments for Compliance with Regulatory Requirements – an Update to the 1995 Royal Meteorological Society Guidance
Issued by the UK Atmospheric Dispersion Modelling Liaison Committee in 2004. Guidelines prepared by M P Ireland (Mott MacDonald, Co-ordinator of DMUG), J A Jones (HPA-RPD, Chairman of ADMLC), R F Griffiths (University of Manchester), B Ng (Environment Agency), and N Nelson (Defra).
Within this 28 page document, section 7 on "Sensitivity, uncertainty and variability" (4 pages long), section 8 on "Quality assurance" (4 pages long), and section 9 on "Auditability" (1 page) are particularly related to working with uncertainty.
The guidance includes its own definitions of the terms 'sensitivity', 'uncertainty', and 'variability' and recommends some well known statistical and modelling methods for calculating and displaying them. The material on quality assurance recommends some typical methods for finding mistakes and assessing a forecast's reliability.
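The guidance does not prescribe code, but the simplest kind of sensitivity check it alludes to, varying one input at a time and watching the output, can be sketched like this (the toy 'dispersion model' and its inputs are invented stand-ins, not anything from the guidance):

```python
# Sketch: one-at-a-time sensitivity analysis of a toy model.

def concentration(emission_rate, wind_speed, mixing_height):
    """Toy ground-level concentration: higher emissions raise it,
    stronger wind and deeper mixing dilute it."""
    return emission_rate / (wind_speed * mixing_height)

baseline = {"emission_rate": 10.0, "wind_speed": 4.0, "mixing_height": 800.0}
base_value = concentration(**baseline)

# Perturb each input by +/-20% in turn and record the swing in the output.
for name, value in baseline.items():
    lo = concentration(**dict(baseline, **{name: value * 0.8}))
    hi = concentration(**dict(baseline, **{name: value * 1.2}))
    swing = (max(lo, hi) - min(lo, hi)) / base_value
    print(f"{name:>14}: output swing {swing:.0%} of baseline")
```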
Pollution
A lot of the work on analysing pollution tries to work out the dangers of particular types of pollution and, sometimes, the costs and benefits of taking action to reduce it. The impacts of pollution include illness in people, such as cancer, and wider environmental damage.
A lot of work on 'risk analysis' by people who are 'risk analysts' falls within this area. Most of these people have a scientific background.
Guidance on the Development, Evaluation, and Application of Environmental Models
Issued by the United States Environmental Protection Agency (EPA) in 2009.
Within the document's 99 pages, issues related to uncertainty arise frequently. Bear in mind that the phenomena being modelled are usually frequencies of harm (e.g. the rate of cancer deaths) represented as probabilities, so when the document talks about 'uncertainty' it is referring to uncertainty as to the frequencies involved. Most people would see the entire modelling exercise as 'risk modelling'.
This guidance reflects the modern trend towards environmental risk analysis that supports decision making about what, if anything, to do differently. This means that analysts are sometimes required to go beyond their traditional strong areas of health and pollution and consider also the actions that might be taken to reduce pollution, which themselves may have many uncertain aspects.
There is advice on every stage of devising and using models in decision support, including lots of advice on assessing models.
Most of this advice is applicable to any decision making effort and readers need no knowledge of environmental science to understand it.
A Framework for the Economic Assessment of Ecological Benefits
Issued by the United States Environmental Protection Agency (EPA) in 2002.
This 185 page document has references to uncertainty at appropriate places.
The main theme of this document is defining and valuing important benefits, but uncertainty is mentioned at appropriate points. For example, in planning a modelling exercise the approach to uncertainty should be considered.
Science Policy Council Handbook: Risk Characterization
Issued by the United States Environmental Protection Agency (EPA) in 2000. Principal authors, John R. Fowle III, Ph.D. (Science Advisory Board Office of the Administrator) and Kerry L. Dearfield, Ph.D. (Office of Science Policy, Office of Research and Development).
All 189 pages of this document are relevant to working in uncertainty.
This is a very detailed, but clear and helpful, document about how to 'characterize' the chances of something bad that might happen. The differences between characterization, analysis, and communication are explained. There are illustrative case studies. Specific techniques for characterization are explained.
The main concern of this document is analysis of health and ecological risks for regulatory purposes, but most of its advice is applicable far more widely.
Framework for Cumulative Risk Assessment
Issued by the United States Environmental Protection Agency (EPA) in 2003 as a framework for developing future guidance and requirements.
All 129 pages are relevant to working with uncertainty.
This document is about how to assess health and ecological danger from multiple sources. In other words, it is about risk aggregation. The framework goes through each of the major phases of analysis, systematically. As with most EPA documents, the references to 'uncertainty' relate to uncertainty about frequencies, predominantly.
The framework does not consider 'adding up' the risks to be cumulative risk assessment. The approach it describes involves understanding possible interactions and thinking about them clearly. Its principles apply whether this is done quantitatively or not.
Policy for use of Probabilistic Analysis in Risk Assessment
Issued by the United States Environmental Protection Agency (EPA) in 1997 to establish conditions for acceptance of probabilistic analyses.
The 4 pages are all relevant to uncertainty (remembering that when the EPA refers to uncertainty it usually means uncertainty about the frequencies involved).
The requirements repeat the usual themes about full explanations, especially where there is uncertainty.
Guiding Principles for Monte Carlo Analysis
Issued by the United States Environmental Protection Agency (EPA) in 1997 to help promote analysis of variability and uncertainty in risk analyses.
All 39 pages are relevant.
The guidance goes through the steps of risk analysis, giving advice along the way. Familiar themes crop up, such as cost-effectiveness of analyses, the need for full explanations, and some technical finer points.
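A rough sketch of the sort of Monte Carlo exposure calculation this guidance is aimed at, with invented inputs and distributions. A fuller analysis would also keep variability across people separate from uncertainty about the distributions themselves, a distinction the guidance is concerned with.

```python
# Sketch: Monte Carlo simulation of an exposure model. Inputs are invented.
import random

def simulate_daily_dose():
    """Dose (mg per kg body weight per day) = concentration * intake / body weight."""
    concentration = random.lognormvariate(-1.0, 0.5)    # mg/L, lognormal (assumed)
    intake = random.lognormvariate(0.7, 0.3)             # litres/day, lognormal (assumed)
    body_weight = max(random.gauss(70.0, 12.0), 30.0)    # kg, normal, floored (assumed)
    return concentration * intake / body_weight

doses = sorted(simulate_daily_dose() for _ in range(50_000))
for pct in (50, 90, 95, 99):
    print(f"{pct}th percentile dose: {doses[len(doses) * pct // 100 - 1]:.4f}")
```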
Health
There is a certain amount of overlap between health and pollution, but these documents seem to be more about health.
Review of guidelines for good practice in decision-analytic modelling in health technology assessment
Issued by the UK's NHS R & D Health Technology Assessment Programme in 2004. Written by Z Philips, L Ginnelly, M Sculpher, K Claxton, S Golder, R Riemsma, N Woolacott, and J Glanville.
This 179 page document is entirely relevant. Its references to uncertainty are usually to uncertainty about frequencies, with the frequencies often represented as probabilities. Unfortunately, only the executive summary is now freely available to everyone.
This document is a very thorough review of many guidelines from different sources on decision making about health, on the basis of which a synthesized set of guidelines is provided. Sensitivity analysis and probabilistic modelling are recommended with some interesting details on what to focus on. There is good advice throughout.
Improvement of Risk Assessment in View of the Needs of Risk Managers and Policy Makers (Preliminary opinion approved for public consultation the expected input of which will lead to changes in the final text of the opinion.)
Issued by the European Commission Health & Consumer Protection Directorate General in 2011.
All 75 pages are relevant to management under uncertainty.
This document sets out guidance on risk analysis and decision support. The idea is that risk analysis should support decision making about improvement actions. The main body contains quite a lot of lengthy discussion that is not directly very helpful, but the appendices make up for that. There's a nice case study about swimming in polluted water that illustrates the integrated approach.
Law
Uncertainty is central to court cases and there are situations where experts provide information based on probabilistic reasoning. Mistakes have been made, including mistakes by judges. Hence, the need for guidance.
Communicating and Interpreting Statistical Evidence in the Administration of Criminal Justice, PRACTITIONER GUIDE NO 1 Fundamentals of Probability and Statistical Evidence in Criminal Proceedings: Guidance for Judges, Lawyers, Forensic Scientists and Expert Witnesses
Issued by the Royal Statistical Society’s Working Group on Statistics and the Law in 2010. Written by Colin Aitken (Professor of Forensic Statistics, University of Edinburgh), Paul Roberts (Professor of Criminal Jurisprudence, University of Nottingham), and Graham Jackson (Professor of Forensic Science, Abertay University). There are now three more volumes on this topic.
All 122 pages are relevant.
The guidance has good coverage of basic principles, including Bayesian reasoning, which is one area where mistakes have been made in the past. Section 3 lists and explains 'traps for the unwary'. Even if you are not involved in court cases this is very useful reading about how to work out what probably happened, given evidence.
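As a worked illustration of the kind of Bayesian reasoning the guide covers, here is a likelihood-ratio update with invented numbers; it also shows why confusing P(evidence | not the source) with P(not the source | evidence), the 'prosecutor's fallacy', matters:

```python
# Sketch: updating odds with a likelihood ratio, the core move in this kind
# of reasoning about evidence. All numbers are invented for illustration.

prior_odds = 1 / 1000          # prior odds that the suspect is the source (assumed)
p_evidence_if_source = 0.99    # P(matching trace | suspect is the source) (assumed)
p_evidence_if_not = 0.001      # P(matching trace | suspect is not the source) (assumed)

likelihood_ratio = p_evidence_if_source / p_evidence_if_not
posterior_odds = prior_odds * likelihood_ratio
posterior_prob = posterior_odds / (1 + posterior_odds)

print(f"Likelihood ratio: {likelihood_ratio:.0f}")
print(f"Posterior probability the suspect is the source: {posterior_prob:.1%}")
# Note: P(evidence | not the source) is 0.1%, yet P(not the source | evidence)
# here is about 50% - treating the first as the second is the prosecutor's fallacy.
```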
Banking and insurance
The financial world periodically throws up high profile fiascos and one cause is often the overconfidence of money men and the 'quants' who support them. In response, industry regulators have tried to encourage a more competent and diligent approach.
One peculiar aspect of the guidance in these areas is that it is a hybrid of ideas and terminology inspired by both management science and the Risk Listing ideas promoted by Big 4 external audit firms. This means that, alongside solid guidance on modelling and decision making, there are some illogical requirements and turns of phrase. It often feels as if a mathematically illiterate auditor is writing rules for mathematicians and scientists.
Also, many economists follow a strictly frequentist view of probability and are inclined to ignore uncertainty about frequencies. This is another weakness to be wary of when reading the guidance.
Solvency II Detailed guidance notes, March 2010, Section 2 - model scope, governance and use
Issued by Lloyd's (the London insurance market) in 2010.
This 29 page document is all relevant, being a document about using probabilistic models for business planning purposes.
This document is one of many that aim to help insurance companies comply with Solvency II, a set of regulations designed to ensure that they have enough money to pay out on insurance claims, even if things turn out worse than they expect. Central to this is the requirement to think about, and calculate, how much money they need. This involves forecasting, using models, to predict how things could turn out in future. The regulators would like companies to use the forecasting models for their own business decisions, not just for regulatory compliance.
So, not only does this document have guidance on the creation and governance of models, but it also has lists of business decisions where the models could be used.
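Solvency II's headline calibration, as I understand it, is that an insurer should hold enough capital to survive roughly a 1-in-200 outcome over one year. A toy Monte Carlo sketch of that kind of calculation, far simpler than any real internal model and with all figures invented:

```python
# Sketch: a toy 'how much capital do we need to survive a bad year?' calculation.
# Portfolio, distributions, and parameters are invented.
import random

PREMIUMS = 60.0   # premium income for the year (invented units)
EXPENSES = 10.0   # fixed expenses (invented)

def simulate_annual_result():
    """One simulated year: premiums minus expenses minus total claims."""
    n_claims = sum(1 for _ in range(500) if random.random() < 0.05)    # claim count
    total_claims = sum(random.lognormvariate(0.0, 1.0) for _ in range(n_claims))
    return PREMIUMS - EXPENSES - total_claims

results = sorted(simulate_annual_result() for _ in range(10_000))
# Take the 1-in-200 (0.5th percentile) outcome as the 'bad year' to survive.
bad_year = results[len(results) // 200]
capital_needed = max(0.0, -bad_year)
print(f"1-in-200 year result: {bad_year:.1f}")
print(f"Indicative capital to absorb it: {capital_needed:.1f}")
```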
Solvency II Detailed guidance notes March 2010 Section 4 - statistical quality standards
Issued by Lloyd's (the London insurance market) in 2010.
This 24 page document is all relevant.
This lists requirements on models, but the requirements are often idealistic or vague and this seems to be because of a directive quoted in the document. For example, according to the directive the information used has to be 'credible' and assumptions have to be 'realistic'. These are not the sort of words one can easily audit to. There are also some surprising references to risk 'ranking' - the sort of thing advocated for risk registers.
Solvency II Detailed guidance notes March 2010 Section 5 - calibration, validation and profit & loss attribution
Issued by Lloyd's (the London insurance market) in 2010.
This 28 page document is all relevant.
Despite the reference to 'calibration', there is no material in this document about calibration of probabilities in the usual sense. There are just vague and idealistic references to testing models against all relevant experience. There are, however, lots of requirements for things to check, and some are good ideas.
Conceptual Framework for Technical Actuarial Standards
Issued by The Financial Reporting Council's Board for Actuarial Standards (in the UK) in 2008.
This 23 page document contains about 2 pages of material that is relevant to uncertainty, under the headings of 'uncertainty', 'risks', and 'assumptions'.
Conceptually, I think there are some problems here, but the basic position on uncertainty is clear. One of their four fundamental principles is that if an actuary does a calculation and there is any uncertainty involved then that uncertainty has to be communicated to users. The wording they use is this: "Actuarial information cannot be regarded as complete unless it includes an indication of any uncertainty inherent in the information."
Technical Actuarial Standard D: Data
Issued by The Financial Reporting Council's Board for Actuarial Standards (in the UK) in 2009.
This 10 page document contains about 2 pages of material that is relevant.
This is straightforward advice on checking and remediating data used as input to actuarial work.
Technical Actuarial Standard M: Modelling
Issued by The Financial Reporting Council's Board for Actuarial Standards (in the UK) in 2010.
This 18 page document contains surprisingly little of direct relevance but there is material on assumptions and on different types of estimate.
One reason that actuarial work often relies on bundles of assumptions is that calculations based on documented assumptions are what is legally required.
Technical Actuarial Standard R: Reporting
Issued by The Financial Reporting Council's Board for Actuarial Standards (in the UK) in 2010.
References to uncertainty (and risk) are scattered within this 15 page document.
The main requirement is to explain and characterize material uncertainty and techniques are listed for doing this. The difficult areas not explained very well in this document are the obvious overlaps between uncertainty, risks, and assumptions.
Operational Risk – Supervisory Guidelines for the Advanced Measurement Approaches
Issued by the Basel Committee on Banking Supervision in 2011.
This 63 page document is all relevant.
Most of the material is detailed guidance on how to build models giving probabilistic forecasts of the total financial impact of operational 'risk events' over a one year period. The regulators have been monitoring the techniques banks have been using in their models and updating the guidance so that it responds to good and bad practices noticed.
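Typically such models combine a frequency distribution (how many loss events happen in a year) with a severity distribution (how big each one is), then simulate to get the distribution of the annual total. A stripped-down sketch with invented parameters; the 99.9% quantile reflects the one-year soundness standard usually cited for this kind of capital calculation:

```python
# Sketch: a minimal loss-distribution-style simulation. Frequency and severity
# parameters are invented; real models are fitted to loss data and scenarios.
import math
import random

def poisson(lam: float) -> int:
    """Knuth's method - fine for the small event rates used here."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def annual_loss(event_rate=10.0, sev_mu=10.0, sev_sigma=2.0):
    """Total of a Poisson number of lognormally distributed losses."""
    return sum(random.lognormvariate(sev_mu, sev_sigma)
               for _ in range(poisson(event_rate)))

losses = sorted(annual_loss() for _ in range(50_000))
for pct in (0.50, 0.95, 0.999):
    print(f"{pct:.1%} quantile of annual loss: {losses[int(len(losses) * pct) - 1]:,.0f}")
```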
Section 23.609 of Customer Clearing Documentation, Timing of Acceptance for Clearing, and Clearing Member Risk Management
A rule issued by the Commodity Futures Trading Commission (in the USA) in 2012.
Section 23.609 is relevant.
The rules require certain companies that deal in swaps to perform a long list of specific checks on their positions, usually frequently. While the rules do not say that the various evaluations and tests need to be quantitative or to involve calculations, it is obvious that they will.
Policy
Government policy making is another area where huge uncertainty makes important decisions difficult.
The Green Book: Appraisal and Evaluation in Central Government
Issued by HM Treasury in 2003, and updated in 2011.
This 118 page document contains roughly 15 pages of material specific to risk and uncertainty in Chapter 5 and Annex 4.
This document is about how the UK government should approach decisions. It covers familiar topics like objectives, developing alternative courses of action, predicting and valuing the results of each, making a choice, implementing it, and evaluating the results. There is material on uncertainty specifically within Chapter 5 and Annex 4.
This material is a mix of management science with some Risk Listing under the heading of 'risk management'. This Risk Listing material is presumably in deference to the Orange Book, also produced by HM Treasury, which is pure Risk Listing.
The Magenta Book: Guidance for Evaluation
Issued by HM Treasury in 2011.
This 141 page document has relevant advice on risk and uncertainty scattered throughout.
This document is about how the UK government should approach evaluations of policies. It covers both evaluating policies once they have been implemented and designing policies and their implementation so as to make evaluation easier. I particularly like its messages on incrementally delivering policies.
IA Toolkit: How to do an Impact Assessment
Issued by HM Government in 2011 to tell departments how to analyse the impacts of proposed policy changes. These assessments are required and must be published.
This 48 page document contains relevant material at paragraphs 107-111, and also scattered throughout the document in paragraphs 16, 18, 27, and 104.
This document gives simple, sensible advice on dealing with uncertainties around impact analyses. The form for showing headline financials calls for high and low estimates and the final ministerial sign off (theoretically) requires the minister to take into consideration the uncertainty around the assessment. (Having skimmed a few recent impact assessments it is hard to say if the guidance has been followed in a way that usually helps. The assessments are complex documents and many of the calculations are presented only as summaries.)
The AQuA Book: guidance on producing quality analysis for government
Issued by HM Treasury in 2015 in response to an embarrassing incident involving a model that was wrong, this guide is about how to produce models for decision-making purposes.
This 70 page document contains relevant material throughout, but has two chapters in particular focusing on uncertainty. Chapter 5 is on uncertainty and has 3 pages on the importance and implications of uncertainty. Chapter 8 then has 10 pages on analysing uncertainty.
Most of the guidance is fairly high level and conventional. Unfortunately, there are no mentions of Bayesianism and there is a rather artificial distinction between risk and uncertainty. It's not perfect, but it is generally another step in the right direction for the UK government.
DfT analytical assurance framework: strength in numbers
Issued by the Department for Transport in 2014 in response to that embarrassing incident involving a model that was wrong, this collection of guides is about how to produce and use models for decision-making purposes.
The document with guidance on quality assurance is 33 pages long, with mentions of risk and uncertainty throughout.
Like the AQuA Book, this suffers from a lack of Bayesian thinking and, consequently, an artificial distinction between risk and uncertainty. Nevertheless, most of the guidance is good, solid stuff and useful for important decision-support models.
Circular A-4, subject: Regulatory Analysis
Issued by the Office of Management and Budget (of the USA) in 2003 to guide Federal agencies in developing regulatory analyses (impact assessments).
This 48 page document contains 4 pages of directly relevant material and also many mentions of uncertainty at appropriate points throughout the document.
Compared to the UK document above, this goes into more detail about how to deal with uncertainty in assessments.
Safety
Scientific approaches to safety have developed over the past few decades. An early success story was in nuclear safety, where probabilistic models capable of aggregating lots of events revealed that smaller incidents were more likely than most people thought. Up to that time many people had thought it sensible to try to manage individual worries down to such a low level that they could each then be ignored completely in further analysis. What proper modelling revealed was that many potential failures, supposedly each managed down to a negligible level, can add up to a big danger when they are in a complex system where everything has to be working.
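The arithmetic behind that discovery is easy to show. The component count and failure probabilities below are invented, purely to illustrate the effect:

```python
# Sketch: why many individually negligible failure probabilities are not
# collectively negligible in a system where everything must work.

n_components = 300
p_fail_each = 0.001          # each looks negligible on its own

# If failures are independent, the system survives only if every component does.
p_system_fails = 1 - (1 - p_fail_each) ** n_components
print(f"P(at least one of {n_components} components fails) = {p_system_fails:.1%}")
# ~26%: far from negligible, which is what probabilistic modelling revealed.
```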
Scientific measurement
This is something that comes up in many contexts.
Measurement Good Practice Guide 11 (issue 2) A Beginner's Guide to Uncertainty of Measurement
Issued by the National Physical Laboratory (of the UK) in 2001. Written by Stephanie Bell from their Centre for Basic, Thermal and Length Metrology.
This 41 page document is entirely relevant.
"A measurement result is only complete if it is accompanied by a statement of the uncertainty in the measurement." Good advice, and backed up with lots of detail on how to use statistical methods to do it.
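A minimal sketch of the sort of 'Type A' (statistical) evaluation the guide explains: repeat the measurement, use the spread of the readings to get a standard uncertainty of the mean, then quote an expanded uncertainty with a coverage factor (k = 2 for roughly 95% confidence). The readings here are invented.

```python
# Sketch: a Type A evaluation of measurement uncertainty from repeated readings.
import statistics

readings = [10.012, 10.008, 10.011, 10.015, 10.007, 10.010, 10.013, 10.009]

mean = statistics.mean(readings)
s = statistics.stdev(readings)          # spread of individual readings
u = s / len(readings) ** 0.5            # standard uncertainty of the mean
U = 2 * u                               # expanded uncertainty, k = 2 (~95%)

print(f"Result: {mean:.4f} +/- {U:.4f} (k = 2)")
```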
Presenting statistics
This too is something that comes up in many contexts.
Communicating uncertainty and change: Guidance for official statistics producers
Issued by the Government Statistical Service (of the UK) in 2014.
This 17 page document covers communicating change as well as communicating uncertainty, and some of the material is about communicating uncertainty about change.
This is simple advice, nicely illustrated by some examples that are analysed in detail to highlight the presentation techniques used.