Quantitative Risk Analysis: You Have More Data Than You Think

We usually hear the following among new FAIR practitioners, or those who have only conducted qualitative risk assessments:

“I simply do not have enough data to conduct a quantitative analysis.”

These individuals believe that conducting a quantitative analysis requires an endless stream of highly complex data that can only be crunched and made intelligible by overpriced and unnecessarily complicated algorithms. There are competitors to RiskLens, and other companies out there, that will make this seem like the case, but it’s simply not so.

One of the great things I learned when I was introduced to FAIR was a pair of simple concepts outlined by Douglas Hubbard in his seminal book, How to Measure Anything:

  • You have more data than you think you have.
  • You need less of it than you may assume.
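Hubbard makes the second point concrete with his “Rule of Five”: there is a 93.75% chance that the median of any population lies between the smallest and largest of just five random samples. A quick simulation bears this out (a sketch in Python; the lognormal population is purely illustrative):

```python
import random

# Hubbard's "Rule of Five": the probability that a population's median
# lies between the min and max of five random samples is
# 1 - 2 * (0.5 ** 5) = 93.75%, regardless of the population's shape.
rng = random.Random(1)

# An arbitrary skewed population (illustrative only).
population = [rng.lognormvariate(10, 1.5) for _ in range(100_000)]
true_median = sorted(population)[len(population) // 2]

hits = 0
trials = 5_000
for _ in range(trials):
    sample = [rng.choice(population) for _ in range(5)]
    if min(sample) <= true_median <= max(sample):
        hits += 1

print(f"median captured in {hits / trials:.1%} of trials")  # close to 93.75%
```

Five data points are rarely enough to finish an analysis, but they are often enough to meaningfully narrow a range, which is the whole point.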

Many people new to this process, risk analysts included, get caught up with where to start. When presented with a question that I don’t have a clear and decisive answer for, I work through the following:

Start with an absurd range

By starting with an absurd range, I reduce my chances of “anchoring” my estimates to a preconceived or biased answer. In this fashion, I open my perspective to previously unconsidered answers.

Reference what I know

What do I know about the question or subject posed that will allow me to dial in the distribution and gain precision in my estimate? It is very rare that I can’t reference some previous experience or piece of external data that will help me with my distribution.
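Once a range has been dialed in, it is typically expressed as a calibrated minimum / most-likely / maximum estimate and turned into a distribution for simulation. A minimal sketch, using the modified-PERT distribution (a common choice for this, though not the only one; the numbers are invented for illustration):

```python
import random
from statistics import mean

def pert_sample(low, mode, high, rng, lam=4.0):
    """Draw one sample from a modified-PERT distribution.

    low/mode/high are the calibrated minimum, most-likely, and maximum
    estimates; lam controls how tightly samples cluster around the
    mode (4 is the conventional default).
    """
    a = 1 + lam * (mode - low) / (high - low)
    b = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.betavariate(a, b)

# Illustrative calibrated estimate: attacks per year, narrowed from an
# absurd starting range down to something defensible.
rng = random.Random(42)
samples = [pert_sample(10, 40, 120, rng=rng) for _ in range(10_000)]
print(round(mean(samples)))  # pulled toward the most-likely value
```

The distribution rewards the most-likely value without ignoring the tails, which is exactly what a calibrated range is meant to capture.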

Apply the law of diminishing returns

The law of diminishing returns dictates how much effort and resources should be expended in pursuit of ever-greater precision in our inputs. I always keep in mind that we are looking for accuracy with a useful degree of precision when it comes to our estimates. Reaching a good, accurate estimate is often time better spent than the additional hours required to chase marginal gains in precision (i.e., you need less data than you may assume).

In addition to the concepts outlined above, the FAIR ontology serves as a roadmap to the subject matter experts (SMEs) within your organization who possess many of the data points elicited in an analysis. Who within your organization will know how many attacks you’re experiencing in a given year? Possibly your SOC or Incident Response department. Who within your organization will know the strength of your controls environment? Possibly those who conduct penetration testing for the organization. Who within your organization knows the anticipated revenue impact of a core application outage? Possibly a member of the business. All of these questions are nodes within the FAIR ontology that require answers. Your job as an analyst is to go out and find them.
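Once those SME estimates are collected, Monte Carlo simulation combines them into a loss exposure. A minimal sketch, assuming illustrative PERT-shaped estimates for loss event frequency (from the SOC) and per-event loss magnitude (from the business) — the figures are invented, not from any real analysis:

```python
import random
from statistics import mean, quantiles

def pert_sample(low, mode, high, rng, lam=4.0):
    # Modified-PERT sample from a calibrated low/most-likely/high estimate.
    a = 1 + lam * (mode - low) / (high - low)
    b = 1 + lam * (high - mode) / (high - low)
    return low + (high - low) * rng.betavariate(a, b)

rng = random.Random(7)
annual_losses = []
for _ in range(10_000):
    # Loss event frequency: events per year (hypothetical SOC estimate).
    lef = pert_sample(0.5, 2.0, 6.0, rng)
    # Loss magnitude: dollars per event (hypothetical business estimate).
    magnitude = pert_sample(20_000, 75_000, 400_000, rng)
    annual_losses.append(lef * magnitude)

print(f"mean annualized loss: ${mean(annual_losses):,.0f}")
print(f"90th percentile:      ${quantiles(annual_losses, n=10)[-1]:,.0f}")
```

The output is a distribution of annualized loss rather than a single number, which lets decision-makers talk about exposure in terms of percentiles instead of point estimates.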

As a RiskLens/FAIR analyst, I use these concepts every day. Whether in our training sessions or onsite with our customers, this approach hasn’t failed me yet.