
Measuring Aggregate Risk Analysis

by Jack Jones on Feb 10, 2016 11:18:52 AM

One of the questions I commonly encounter is "How do you take something like FAIR and apply it to a big problem, like measuring the aggregate risk within an entire organization?"

Measuring the surface area of Long Island

Imagine that you’ve been given the task of determining the surface area of Long Island. How are you going to go about it? One way would be to use a satellite photo like the one below and make a rough estimate based on the level of available detail and an appropriate scale.

[Satellite image of Long Island]

With that information you’d probably report that the surface area is between X and Y square miles, with Z% certainty. If this level of precision isn’t good enough, you might want to carve the area into chunks, zoom in, measure each chunk, and then add them all up.
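The chunk-and-sum approach can be sketched numerically. Below is a minimal Monte Carlo sketch; the per-chunk ranges are made-up numbers purely for illustration, not real measurements of Long Island:

```python
import random

# Hypothetical 90% confidence ranges (square miles) for three chunks.
chunks = [(380, 460), (290, 350), (510, 620)]

trials = 100_000
totals = []
for _ in range(trials):
    # Sample one plausible area per chunk, then sum across chunks.
    totals.append(sum(random.uniform(lo, hi) for lo, hi in chunks))

totals.sort()
p05 = totals[int(0.05 * trials)]  # 5th percentile of the total
p95 = totals[int(0.95 * trials)]  # 95th percentile of the total
print(f"Total area estimate: {p05:.0f} to {p95:.0f} sq. miles (90% interval)")
```

Note how the summed interval is narrower, relative to its midpoint, than the individual chunk ranges: independent errors partially cancel, which is one reason carving the landscape into chunks buys precision.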

If even more precision is needed, you can carve the landscape into finer chunks... 

[Satellite images of Long Island divided into progressively finer chunks]

You could also take surveying equipment into the field and make measurements on the ground.

Extrapolating these approaches, it’s conceivable (given sufficient time and tools) that you could try to measure the sub-atomic particles that make up the atoms, molecules, and objects that constitute the island’s surface area. Let me know how that goes.

Regardless of your level of abstraction and your approach, two things are clear:

  1. There is a point of diminishing returns when it comes to measurement precision, and that point is found at the balance between how the measurement is going to be used and the cost/effort in making the measurement
  2. There is always some degree of uncertainty and imprecision. Even at a sub-atomic level of measurement, the effects of erosion and other dynamic processes guarantee that by the time you’ve finished measuring, the subject being measured will have changed

Measuring aggregate risk

In tackling the problem of aggregate risk, we can likewise carve up the landscape into chunks that are measurable and meaningful given our time and resources. One example might be to define a set of object groups such as:

  • Network devices
  • Network transmission media
  • Servers
  • Personal systems
  • Applications
  • Printed media, etc…

Harking back to our notion of abstraction, you could carve the landscape even finer; for example, by defining subsets of servers based on operating system, function, location, line of business, etc. Bottom line: you’ll want to define a level of abstraction that provides useful information given the available time and resources.

In order to perform a risk analysis, we also have to define our threat landscape. This, too, needs to be defined at an appropriate level of abstraction. For example, a high level breakdown might look like:

  • External hackers
  • Disgruntled employees
  • Contractors, etc…

You can also define threats that aren’t malicious, such as:

  • Acts of nature
      • Weather
      • Geological disturbances
      • Extraterrestrial (No - not ET. I mean things like solar flares, etc.)
      • Animals (non-human)
  • Errors & failures
      • Employees
      • Service providers
      • Suppliers

 ... and/or become even more granular in your definitions.

With your landscape defined, you’re in a position to begin a series of high-level FAIR analyses.
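One way to organize such a series of analyses is to pair each asset group with each relevant threat and run them through a Monte Carlo simulation. The sketch below is a deliberate simplification: a full FAIR analysis decomposes loss event frequency and loss magnitude much further, and every name and range here is an illustrative assumption, not output from a real analysis:

```python
import random

# Simplified sketch: each (asset group, threat) scenario gets a calibrated
# range for annual loss event frequency (lef) and per-event loss magnitude
# (lm, in dollars). All names and numbers below are hypothetical.
scenarios = {
    ("servers", "external hackers"):           {"lef": (0.1, 2.0),  "lm": (50_000, 500_000)},
    ("applications", "disgruntled employees"): {"lef": (0.05, 0.5), "lm": (10_000, 200_000)},
    ("printed media", "errors & failures"):    {"lef": (1.0, 6.0),  "lm": (500, 5_000)},
}

def simulate_annual_loss(trials=50_000):
    """Monte Carlo over all scenarios; returns sorted total annual losses."""
    totals = []
    for _ in range(trials):
        total = 0.0
        for params in scenarios.values():
            # Sample a frequency and a magnitude, multiply, and aggregate.
            total += random.uniform(*params["lef"]) * random.uniform(*params["lm"])
        totals.append(total)
    totals.sort()
    return totals

losses = simulate_annual_loss()
p50 = losses[len(losses) // 2]
p90 = losses[int(0.9 * len(losses))]
print(f"Aggregate annualized loss: median ${p50:,.0f}, 90th pct ${p90:,.0f}")
```

The value of structuring it this way is that the per-scenario contributions are preserved, so the results can tell you which asset/threat pairings dominate the aggregate and therefore deserve a deeper, more granular analysis.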

Things to keep in mind

No matter what level of "elevation" you’re operating from, all measurements are estimates, which means there’s always some degree of uncertainty and error. The question, therefore, isn’t whether uncertainty and imprecision exist in a measurement; it’s whether the degree of certainty and precision is acceptable.

The acceptability of a measurement isn’t for the measurer to decide. It’s up to the people who are making decisions based on the information. What’s important is that they understand the degree of (un)certainty and (im)precision in the measurements being provided.

Precision is often (always?) a function of size, complexity, how dynamic the problem is, the quality of available tools and methods, and the time spent measuring. In other words, the more precision desired, the more time and money that will have to be spent.

Accuracy and precision aren’t the same thing. I can be precise without being accurate. For example, an estimate that my 2016 income is going to be $2,352,004.32 is highly precise. Unfortunately, it’s not even close to being accurate (unless a miracle occurs). Conversely, an estimate that my income will be between $50k and $500k isn’t very precise, but it’s likely to be accurate. In an ideal world, we could be both accurate and highly precise. In the real world, at least when measuring risk, you want to be accurate and have an acceptable degree of precision.
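The income example can be made concrete in a few lines. The "true" income below is a made-up number chosen purely to illustrate the distinction:

```python
# Hypothetical actual outcome (illustrative only).
true_income = 120_000

precise_estimate = 2_352_004.32     # highly precise point estimate
accurate_range = (50_000, 500_000)  # wide, imprecise range

# Accuracy here means "does the estimate capture the true value?"
precise_is_accurate = abs(precise_estimate - true_income) < 10_000
range_is_accurate = accurate_range[0] <= true_income <= accurate_range[1]

print(f"Precise estimate accurate? {precise_is_accurate}")  # False
print(f"Wide range accurate? {range_is_accurate}")          # True
```

The precise point estimate misses badly, while the imprecise range succeeds, which is exactly why calibrated ranges, rather than single numbers, are the preferred form for risk measurements.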

The bottom line

When performing an aggregate risk analysis, start at a relatively high level of abstraction and let the results from that analysis guide you regarding where to dive deeper. This helps to ensure that you find a feasible level of precision while managing your time and effort effectively. 

This post was written by Jack Jones

Jack Jones is the EVP of R&D and a Founder of RiskLens. He has worked in technology for over 30 years, the past 28 years in information security and risk management. He has a decade of experience as a Chief Information Security Officer (CISO) with three different companies, including a Fortune 100 financial services company. His work there was recognized in 2006 when he received the Information Systems Security Association (ISSA) Excellence in the Field of Security Practices award. In 2007, he was selected as a finalist for the Information Security Executive of the Year, Central United States, and in 2012, he was honored with the CSO Compass Award for leadership in risk management. Jones, who lives in Spokane, Washington, has served on the ISACA CRISC Certification Committee and RiskIT Task Force, as well as the ISC2 Ethics Committee. He is the author and creator of the Factor Analysis of Information Risk (FAIR) framework. He writes about that system in his book Measuring and Managing Information Risk: A FAIR Approach, which was inducted into the Cyber Security Canon in 2016, as a must-read in the profession.
