There's a lot of maths in Bayesian analysis, but Phil's explanatory style helped me begin to deal with it:
"If it walks like a duck and quacks like a duck, is it possible it's not a duck?" Phil examines this oft-quoted maxim and describes how Bayes' theorem of conditional probability makes raw data useful for making business decisions.
- P(E) : Probability of serious Earthquake
- P(T) : Probability of serious Tsunami
- P(NE) : Probability of serious Nuclear Event
- P(Econ Eng) : Probability of using an economic approach to engineering design and/or construction, i.e. a less robust design
- P(Project Cost Drivers) : Probability of project cost drivers being weighted higher than robust engineering design risk aspects
- P(Op Mgmt) : Probability that equipment has not been operated or maintained correctly, or that safety/training has not been done correctly
- P(Eng) : Probability that engineers are incapable of controlling a serious Nuclear Event
- P(Rc) : Probability of risk changes if the initial understanding of the design, operational maintenance, or safety circumstances has altered, or if any of these have actually been altered
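To make the conditional-probability idea behind the list above concrete, here is a minimal sketch of Bayes' theorem applied to the first two events. All the numbers are hypothetical placeholders for illustration only, not real seismic or tsunami data, and the function name is my own:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers, chosen only to illustrate the mechanics:
p_e = 0.01          # P(E): prior probability of a serious Earthquake
p_t_given_e = 0.3   # P(T|E): probability of a serious Tsunami given an Earthquake
p_t = 0.005         # P(T): overall probability of a serious Tsunami (any cause)

# Posterior: how likely is it that a serious Earthquake occurred,
# given that we have observed a serious Tsunami?
p_e_given_t = bayes(p_t_given_e, p_e, p_t)
print(round(p_e_given_t, 2))  # 0.6
```

The same update can be chained through the other probabilities in the list, each observation revising the prior for the next event in the sequence.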
So even if we did all of the above analysis and it suggested a serious risk of generating an intolerable situation, do we dismiss this as pessimistic engineering reasoning and then optimistically run it through the lens of Evidence Based Reasoning, which runs something like:
And developing people, communities & organizations to proactively & reactively face complexity with resilience rather than denial?
Simplifying where possible?
ASQ's Paul Borawski recently challenged us, as ASQ Global Influential Voices for Quality, to reflect on "What is the Future of Quality?"
Is this perhaps the future direction for quality management and for a future ISO 9001:2015: to be more upfront about aggressively and openly addressing risk?