By: Jack Jones
One of the concerns that people express regarding quantitative analysis is that, too often, people attribute undue credibility to any sort of quantitative analysis. “If it’s in a spreadsheet, it must be so.” So, should all quantitative analyses be condemned as a result? Hardly. That’s clearly throwing the baby out with the bathwater. It should, however, make us careful. Careful is good. Careful causes us to ask things like, “Where did the numbers used in the analysis come from?” and “What model and assumptions were used in the analysis?” If the answers to these questions can be explained and rationally defended, then there’s no credible reason I can think of to not use quantitative analyses to support decision making.
Do people screw up quantitative analyses? Yep, absolutely. Of course, people screw up qualitative analyses too. Frequently, in fact, based on many of the ones I’ve seen. The lack of critical thinking or basic understanding of risk that I’ve encountered in many qualitative analyses is remarkable. Unfortunately, when people screw up a qualitative analysis, it often goes unrecognized because people just don’t bother to question what went into it. I’d argue that this is every bit as bad an outcome as when people put too much faith in quantitative analyses.