by David Vose
We like the word ‘professional’ – it gives us a sense of security. We have our accounts prepared by a qualified accountant, we use a lawyer to write a contract, and we go to a doctor for medical advice. So if your business or government agency needed to make a very important, risk-based decision, would you make sure that the risk assessment was done by an experienced, trained risk analyst? Or an even bigger question – if you have a corporate risk management process in place, is it designed and maintained by a really good risk analyst?
The answer, unfortunately, is very probably ‘no’.
Why? Because risk analysis is not really recognized as a profession in itself; it is more an extension of another discipline. There are very few people around who can genuinely be described as professional risk analysts, and I doubt that will change very much in the near future. So businesses make do with what they have, which frankly is often not very much at all. I think we can do much better.
The king of risk analysis techniques is quantitative risk analysis (QRA), where one builds a Monte Carlo simulation model (usually in Excel) to describe the uncertainties surrounding what might happen. QRA models have the potential to provide senior decision-makers with extremely valuable information – information that shows, for example, that a potential investment is far too risky, that too little is known about a drug for it to become an approved medication, or that a project is very likely to exceed its budget or deadline.
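To make the idea concrete, here is a minimal sketch of such a model in Python rather than Excel – all figures are invented for the illustration, and a real QRA model would of course be far richer. Each uncertain cost item is described by a distribution, the model is sampled many times, and the output is a probability the decision-maker can act on:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo iterations

# Hypothetical project with three uncertain cost items.
# Triangular(min, most-likely, max) is a common choice in risk models.
design  = rng.triangular(80, 100, 150, n)
build   = rng.triangular(200, 250, 400, n)
testing = rng.triangular(30, 40, 90, n)

total = design + build + testing
budget = 420  # the figure the decision-maker has committed to

# The model's real output is not a single number but a risk statement.
p_overrun = (total > budget).mean()
print(f"P(total cost > {budget}) = {p_overrun:.1%}")
print(f"Mean total = {total.mean():.0f}, 90th percentile = {np.percentile(total, 90):.0f}")
```

The point of the exercise is the last two lines: instead of a single ‘best estimate’, the decision-maker sees the chance of exceeding the budget and how bad the plausible outcomes are.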
The results of QRA models are routinely employed by businesses and governments to help make huge decisions – decisions worth millions or even billions of dollars, decisions that affect thousands of lives. In the course of my consulting work, I regularly see many such models.
Nearly every risk analysis model I have ever reviewed, and I have reviewed many, falls far short of what I believe to be ‘professional’ quality, some catastrophically so (meaning the results are useless or grossly misleading). Sometimes the model has been built on so many assumptions that its results should have no credibility. Sometimes the risk modeler has not really thought about the essence of the problem, and instead jumped into building a huge, detailed model that he or she is now too invested in to throw away, no matter how irrelevant the scope. Sometimes the model is based on such poor data – in terms of quality, relevance, or quantity – that the risk analysis has no precision; and sometimes the math is just plain wrong. In a previous white paper (The Perplexing Math of Uncertainty), I illustrated some of the very common math mistakes people make in risk modeling when the model is built from nothing more complex than a combination of add, subtract, multiply, and divide.
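One such mistake is worth illustrating here – this is my own example with invented figures, not one taken from the cited paper. A common habit is to add up ‘conservative’ 90th-percentile estimates for each cost item and present the sum as a conservative total. But percentiles do not add; the only correct route is to simulate the total and read off its percentile:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Ten independent uncertain cost items (hypothetical figures).
items = rng.lognormal(mean=np.log(100), sigma=0.4, size=(n, 10))

# Wrong: add the 90th percentile of each item.
sum_of_p90s = np.percentile(items, 90, axis=0).sum()

# Right: simulate the total, then take its 90th percentile.
p90_of_sum = np.percentile(items.sum(axis=1), 90)

print(f"Sum of item P90s : {sum_of_p90s:.0f}")
print(f"P90 of the total : {p90_of_sum:.0f}")
```

The first figure comes out markedly higher than the second, because it is vanishingly unlikely that all ten items land at their 90th percentile together. A budget built this way is not ‘conservative’ – it is simply wrong, and wrong by a margin that grows with the number of line items.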
What mostly happens is that someone from within the organization who is good with spreadsheets and knows the general topic, but has no risk analysis experience, is tasked with doing a risk analysis, and doing it quickly. This is much more the rule than the exception – whether in science, business, or government. If the analyst is lucky, he or she will get some training in how to use the Monte Carlo simulation risk analysis software that their organization has just purchased for them, and maybe they will buy a textbook like mine, but that is about all the preparation they will have.
This is just an excerpt from a full white paper