Finding efficiency when performing quantitative analyses
A successful, optimized risk analyst team exhibits two traits: efficiency and consistency. But when it comes to quantitative risk analyses, many analysts find efficiency the harder of the two to achieve. We've found that there are two key areas of efficiency to focus on when doing a quantitative risk analysis: process and platform.
When we speak of the risk analysis process, we mean the workflow that runs from the moment a risk issue or request is fielded to the delivery of the results.
“Scripting out” your process is important. You cannot expect the risk team to operate consistently if the process is ad hoc. Clearly define the key activities and steps that need to occur, and provide concrete examples of what success looks like for each.
Focus is key, but not every type of risk analysis can be treated equally. Teams are often overwhelmed with dozens of small-scale assessments related to policy exceptions or audit findings. These monopolize the risk analyst team's time while more significant, and often more valuable, risk assessments covering IT initiatives or emerging threats are put on the back burner. To manage this workload, your team should have an optimized process defined for each risk analysis type. We know from interviewing dozens of highly successful programs that this is a shared concept. As an example, policy exceptions can go through a risk analysis triage that allows the team to rapidly process all requests. The issues with perceived significance can then be prioritized for a more detailed analysis. This single step of systematically triaging policy exceptions allows the team to focus on the risk issues and assessments that matter. We recognize this need and have a risk triage analysis engine incorporated into RiskCalibrator.
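To make the triage idea concrete, here is a minimal sketch of what such a step might look like. The field names, significance score, and threshold are illustrative assumptions, not RiskCalibrator's actual engine:

```python
# Hypothetical triage step: route each policy-exception request either to
# rapid processing or to a queue for detailed analysis, based on a simple
# significance score (estimated frequency x estimated magnitude).

def triage(request: dict) -> str:
    """Return the queue a policy-exception request should be routed to."""
    score = request["est_loss_event_frequency"] * request["est_loss_magnitude"]
    if score >= 100_000:  # assumed materiality threshold, in dollars/year
        return "detailed-analysis"
    return "rapid-processing"

queue = [
    {"id": "PE-101", "est_loss_event_frequency": 0.5, "est_loss_magnitude": 400_000},
    {"id": "PE-102", "est_loss_event_frequency": 0.1, "est_loss_magnitude": 20_000},
]
routed = {r["id"]: triage(r) for r in queue}
# PE-101 scores 200,000 (detailed analysis); PE-102 scores 2,000 (rapid processing)
```

Even a rule this simple keeps the bulk of low-significance requests moving quickly while flagging the few that deserve an analyst's full attention.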
The platform you choose to be the engine that drives your risk analysis should also be designed to optimize risk analysis performance.
We have designed significant performance-related features into our RiskCalibrator platform to optimize your risk analysis performance based on our deep experience with FAIR and quantitative modeling.
A few include:
- Our platform has multiple risk analysis modules that are tailor-made for optimizing different types of risk assessments.
- Asset, Threat, and Loss Table Libraries let analysts look up pre-captured data in any risk analysis.
- Versioning creates analysis iterations that allow for forecasting loss exposure when one or more factors have changed.
- Authoritative Assets provide the ability to define authoritative data associated with specific assets. This enables workshop data to be pre-filled (as a template) in future analyses.
And the list goes on…
Building a successful risk program requires an analyst team that operates at peak performance. Review your process and platform and ensure both are enabling efficient and consistent risk analysis.
Want to discuss this more? Leave a comment or feel free to contact me.