About three years ago I attended one of the ACTKM annual Knowledge Management Conferences. The 2008 conference covered a huge range of topics, including Bayesian Networks. To be honest, I'd parked that in the bottom of the memory banks - until I encountered a PEX article on Bayes by Phil Mercy, who does Six Sigma consulting with Motorola.
I started to reflect on how Bayes' approach could be seen through a lens of Risk, Complexity & Human Errors - key issues in the Future of Quality, a subject raised by ASQ's Paul Borawski (see his recent post). And so follows the output of my learning journey into this maze ...
There's a lot of maths in Bayesian analysis - however Phil's explanatory style did help me begin to deal with it:
Phil starts out with catching birds in a trap and determining if they are ducks or not:
"If it walks like a duck and quacks like a duck, is it possible it's not a duck? ... and examines this oft-quoted maxim and describes how Bayes’ theorem of conditional probability makes raw data useful for making business decisions.
“If it walks like a duck, and quacks like a duck, then it’s a duck.” This oft-quoted maxim is intuitively ‘correct’ and accurately describes the human reasoning process. Evidence about an object, in this case whether it waddles or quacks, is used to help determine the nature of that object, i.e. whether it’s a duck or not. When the weight of evidence builds up in favour of any single outcome, then a human will deduce that this result is the correct one. “If it walks like a duck, and quacks like a duck, then it’s a duck.” This seems ad hoc, and not analytically sound, but in practice this method works really well to guide our day-to-day decisions.
After all, when it comes down to it, we’re all in the business of turning raw data into a correct decision for our business. Luckily for us analytical types, there is a mathematical formalism for this technique: Bayes’ theorem of conditional probability. The probability of an event A occurring is changed if we know something about a related event B:
P(A|B) = P(B|A) × P(A) / P(B)

... and in English: the probability of A, given that B has occurred, is the probability of B, given that A has occurred, times the probability of A, all over the probability of B. If we know that event A normally occurs when event B has already occurred, then knowing something about B may well change your view of A. For complex systems with multiple events A, B, C ... being considered, a Bayesian Belief Network is often used to model the likelihood of an outcome. You’ll find Bayes used in a number of high technology areas such as complex risk analysis, data mining, machine learning, artificial intelligence and language recognition. He then goes on to describe determining the likelihood of a bird being a duck, or not, with the use of a Quackometer & Waddleometer ... By using both types of evidence we’ve improved our success rate and now only 5/100 decisions are wrong.
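To make Phil's arithmetic concrete, here's a minimal sketch in Python of Bayes' theorem applied to the duck example. The prior and the detector error rates below are illustrative assumptions on my part, not figures from Phil's article:

```python
# Minimal sketch of Bayes' theorem for the duck example.
# All numbers are illustrative assumptions, not figures from Phil Mercy's article.

def bayes(p_b_given_a, p_a, p_b):
    """P(A|B) = P(B|A) * P(A) / P(B)"""
    return p_b_given_a * p_a / p_b

# Assumed prior: 30% of trapped birds are ducks.
p_duck = 0.30

# Assumed Quackometer behaviour: it registers a "quack" for 90% of ducks
# and (falsely) for 10% of non-ducks.
p_quack_given_duck = 0.90
p_quack_given_not_duck = 0.10

# Total probability of hearing a quack (law of total probability).
p_quack = p_quack_given_duck * p_duck + p_quack_given_not_duck * (1 - p_duck)

# Updated belief that the bird is a duck, given that it quacked.
p_duck_given_quack = bayes(p_quack_given_duck, p_duck, p_quack)
print(f"P(duck | quack) = {p_duck_given_quack:.2f}")   # ~0.79 under these assumptions

# Now add the Waddleometer evidence, treated (naively) as independent of the quack.
p_waddle_given_duck = 0.85
p_waddle_given_not_duck = 0.15

# Use the post-quack belief as the new prior and update again on the waddle.
prior = p_duck_given_quack
p_waddle = p_waddle_given_duck * prior + p_waddle_given_not_duck * (1 - prior)
p_duck_given_both = bayes(p_waddle_given_duck, prior, p_waddle)
print(f"P(duck | quack & waddle) = {p_duck_given_both:.2f}")  # ~0.96 under these assumptions
```

Combining both pieces of evidence pushes the belief much closer to certainty - the same effect Phil describes when the error rate drops to 5 in 100.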
Of course such reasoning may have been applied to the likelihood of there being Black Swans - i.e. as in European thinking several centuries ago, when impossible events were referred to as being "as likely as a Black Swan". All such reasoning was tipped on its head when European explorers came to Australia, as explained by Nassim Taleb in his book on Black Swans & how resilient organizations prepare for the seemingly impossible.
It did seem to me that the Bayes concepts could be useful for analysing probability, risk & consequences in decision-making. It aligned with questions by ASQ's Paul Borawski in his A View from Q blog post on Quality & Disasters.
Consider an analysis in Japan of the risk of a bad event affecting the population & economy:
- P(E) : Probability of serious Earthquake
- P(T) : Probability of serious Tsunami
- P(NE) : Probability of serious Nuclear Event
If we reviewed each of these independently, then perhaps the probability might be low,
but if we collectively factor in other key aspects, then we might alter our estimation of the risk of such a serious event occurring (see the sketch after this list):
- P(EqT) : Probability that our understanding of Earthquake risk is inaccurate &/or inadequate
- P(Econ Eng) : Probability of using an Economic Approach to Engineering Design &/or Construction = less robust design
- P (Project Cost Drivers) : Probability of Project Cost Drivers being weighted higher than robust Engineering Design risk aspects
- P(Op Mgmt) : Probability that equipment has not been operated, maintained &/or safety/training done correctly
- P(Eng) : Probability of Engineers being incapable of controlling a serious Nuclear Event
- P(Rc) : Probability of Risk Changes, if the initial understanding of the design, operational maintenance or safety circumstances has altered, or if any of these have actually been altered
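Here is a tiny sketch in Python, using entirely invented numbers, of why treating these factors as independent can badly understate the combined risk - the essence of chaining conditional probabilities in a Bayesian Belief Network:

```python
# Illustrative-only sketch of why conditional dependence matters.
# All probabilities below are invented for illustration; they are not estimates for Japan.

p_quake = 0.02                  # P(E): serious earthquake in a given year (assumed)
p_tsunami_given_quake = 0.50    # P(T|E): tsunami far more likely given a big offshore quake (assumed)
p_tsunami_alone = 0.01          # naive "independent" tsunami figure (assumed)
p_nuke_given_both = 0.10        # P(NE|E,T): assumed, reflecting flooded backup power etc.

# Naive estimate: treat the three events as independent rare events.
naive = p_quake * p_tsunami_alone * p_nuke_given_both

# Chained estimate: let each event condition the next.
chained = p_quake * p_tsunami_given_quake * p_nuke_given_both

print(f"naive (independent) estimate : {naive:.2e}")    # 2.00e-05
print(f"chained (conditional) estimate: {chained:.2e}")  # 1.00e-03
# Under these made-up numbers the conditional estimate is ~50x larger,
# which is the point of the list above: the factors are not independent.
```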
So even if we did all of the above analysis, and it suggested a serious risk of generating a situation which is unlikely to be tolerable - do we dismiss this as pessimistic engineering reasoning and then optimistically run it through the lens of Evidence-Based Reasoning? Which runs something like:
as we haven't seen any problems in so many years of recent memory,
therefore we reckon it will be okay, &
we'll deal with the consequences if & when they ever arise?
And then over time, when such cataclysmic events have not occurred, do we begin to believe that the approaches taken years before to deal with risk prevention & response are okay,
even though they have not in fact been tested,
i.e. as such serious situations have not actually arisen to verify that they are robust enough,
so we delude ourselves into believing that these measures have worked so far & have thus made us invulnerable? And so we do not review the analysis on which all this is based? Sort of like Churchill's belief in Fortress Singapore?
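A rough Bayesian back-of-envelope sketch, with entirely assumed numbers, shows how little an unblemished record actually proves about a rare failure mode:

```python
# Back-of-envelope sketch: how much does "no serious failures in 40 years"
# really tell us about a rare catastrophic failure mode?
# The priors, rates and time horizon are assumptions for illustration only.

# Two competing hypotheses about the annual probability of a catastrophic event.
p_annual_safe = 1e-5       # "the design is robust" (assumed)
p_annual_risky = 1e-2      # "the design has a latent weakness" (assumed)
prior_risky = 0.5          # start agnostic between the two (assumed)

years_without_failure = 40

# Likelihood of observing zero failures under each hypothesis.
lik_safe = (1 - p_annual_safe) ** years_without_failure    # ~0.9996
lik_risky = (1 - p_annual_risky) ** years_without_failure  # ~0.669

# Bayes' theorem: posterior belief that the latent weakness exists.
posterior_risky = (lik_risky * prior_risky) / (
    lik_risky * prior_risky + lik_safe * (1 - prior_risky)
)
print(f"P(latent weakness | 40 clean years) = {posterior_risky:.2f}")  # ~0.40
```

Under these made-up numbers, forty clean years only move our belief in a latent weakness from 50% down to about 40% - hardly the proof of invulnerability we like to read into a quiet history.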
And is it then still considered as unlikely as 17th-century Europeans thought Black Swans to be? Hard to make sense of? So we do not even prepare for responding to such serious scenarios? And if we had prepared - would it mean a huge number of incredibly long & complex procedures that go largely unread? And what of the failure to recognise the magnitude of the cataclysmic events even as they are beginning to unfold? What seems to me too often to be the Myers-Briggs Type ISTJ or ESTJ factor among some technocrats & government bureaucrats? Reassure at all costs that everything is under control - New Scientist 9 April 2011 p 10.
And then even if you did recognise the risk of this calamity and responded - what if there were other risk situations in your organization or community, and to deal with preventing the calamity you in fact effectively starved those other risks of the organizational resources & managerial attention that they needed to be adequately addressed? Even worse if there has been a culture of assuring management that these other risk areas do exist but are effectively under control - a sort of Emperor's New Clothes syndrome?
You could probably swap the above earthquake scenario for any of the following:
- GFC (Global Financial Crisis) - previous erroneous view that the economic world consists largely of simple independent transaction markets - New Scientist October 28 2008
- 2004 Tsunami - our understanding of subduction earthquake behaviour is evolving - New Scientist 23 April 2011 p6
- 2011 Christchurch Earthquake - the area was not previously identified as an earthquake zone - New Scientist 26 Feb 2011 p 4
- 2010 BP Gulf Disaster
- 2010 Toyota Recall Crisis
- 2011 Queensland Floods - Wivenhoe Dam - Inquiry told that predicted rainfall events were not factored into Dam Operations, even in flood? No major review of the Operations Manual since 1985?
- 2009 Victorian Bushfires - Inquiry Report - Planning & Response deficiencies
- September 11 2001
And what of developing people, communities & organizations to proactively & reactively face complexity with resilience rather than denial?
Simplifying where it is possible?
I've seen Professor James Reason's Human Error "Swiss Cheese" Model of the multiple factors that cause serious events, and perhaps it is a mental model to deal with complexity? Refer to the DSTO report "A Review of Accident Modelling Approaches for Complex Critical Sociotechnical Systems".
And perhaps by moving into the realms of Complexity Thinking? People such as Bruce Waltuck aka @Complexified (see his Future of Quality post) and Nick Milton are seeing the overlaps of complexity, quality & information/knowledge transfer and management ...
ASQ's Paul Borawski recently challenged us as ASQ Global Influential Voices for Quality to reflect on What is the Future of Quality?
Is this perhaps the future direction for quality management and a future ISO 9001:2015 - to be more upfront about addressing Risk more aggressively & openly?
1 comment:
Thanks for the post, Kerry Anne.
It will take a VERY large bottle of Scotch to address all the issues you touch on; but let me commence the process..
Bayesian Nets and Maths work well provided you are asking the right questions.
With Fukushima, rather than consider the combination of earthquake and tsunami, why not just consider the risk of loss of power? Then add the risk of losing backup power. Then add prevention of access ... all without considering the reasons for those.
That would have given the operators a framework response concentrating on the areas they could control, with the success / failure rate of predicting tsunami or earthquake reduced to a secondary role.