How I Analyzed the Top 10 Cybersecurity Risks for a Financial Institution (a Deep Dive)

August 21, 2019  Cody Whelan

For a few weeks now in blog posts, I’ve referenced the Top 10 cybersecurity risk analysis I conducted with RiskLens for a financial institution customer (see In a Top 10 Risks Analysis, Get These Two Factors Right). Now that anticipation is at a peak, I figured I would provide my overview, along with some key insights into each risk-analysis scenario.


RiskLens risk analyses use the FAIR model (that's Factor Analysis of Information Risk).   See it here.


Most Top 10 cyber risk scenarios are heavily focused on data breaches (i.e. Confidentiality), but this customer’s checklist of potential risks spanned the whole C-I-A triad (Confidentiality-Integrity-Availability), which gave them a more diversified perspective. With that in mind, I grouped the scenarios by their corresponding C-I-A loss type.

 

Key Takeaways:
  1. This is an example of an organization's perceived Top 10 concerns; it's not until we analyze and quantify the scenarios that we learn which of the 10 are true risks to the organization.
  2. A few of the scenarios understandably "flexed" the impact (how bad it could be) side of the FAIR model without considering the frequency (realistically, how often it would occur) side of the ontology.  Remember, you need both when conducting a FAIR analysis.
  3. Where there is limited to no visibility around key data points, don't be afraid to leverage your calibration skills, as well as sound, reliable industry data (Verizon DBIR, Imperva Web Application Attack Report, etc.). A quick sketch of what a calibrated estimate can look like follows below.
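To make that third point concrete, here's a minimal sketch of how a calibrated estimate (minimum, most likely, maximum) can be encoded as a modified-PERT distribution. This isn't the RiskLens engine; the distribution shape, the lambda parameter, and every number below are illustrative assumptions.

```python
import numpy as np

def pert_samples(low, mode, high, size=10_000, lam=4.0, seed=42):
    """Sample a modified-PERT distribution (a re-scaled beta) built from a
    calibrated min / most-likely / max estimate."""
    rng = np.random.default_rng(seed)
    span = high - low
    a = 1 + lam * (mode - low) / span
    b = 1 + lam * (high - mode) / span
    return low + span * rng.beta(a, b, size)

# Hypothetical calibrated estimate of records exposed per insider event,
# informed by (say) DBIR figures where internal visibility is lacking.
records = pert_samples(low=1_000, mode=20_000, high=250_000)
print(f"Median: {np.median(records):,.0f} records exposed per event")
```

The nice property of a PERT-style encoding is that it rewards honest ranges: a wide min-to-max spread produces a wide distribution, which carries your uncertainty straight through to the results.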

Confidentiality
Risk #1 Data Exfiltration by an Insider

Overview

During scoping, we discussed that one of the easiest and most lucrative spots for a malicious insider to steal information would be off a file server. We took the approach of breaking down file servers into the three data types that were most concerning to the customer (i.e. Personally Identifiable Information [PII], Contractual, and Corporate Sensitive Data) to account for differences in the frequency of loss events, and for a lack of visibility into certain data types (i.e. Contractual and Corporate Sensitive Data) leaving the organization.

Findings
The PII file server appeared the most at risk for this kind of data breach by our standard measure, Annualized Loss Exposure (frequency of incidents × magnitude of each incident, over a year). This was due to the volume of records accessible to insiders. We also made calibrated estimates for the other two data types based on access and the records tied to their customer base.
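For readers who want to see how frequency and magnitude combine into Annualized Loss Exposure, here's a toy Monte Carlo sketch. Every parameter is a hypothetical stand-in, and the actual RiskLens simulation is considerably richer than this.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000  # number of simulated years

# Loss Event Frequency: successful insider-exfiltration events per year,
# modeled as Poisson with a hypothetical average of 0.5 events/year.
events_per_year = rng.poisson(lam=0.5, size=N)

# Loss Magnitude: per-event cost drawn from a wide lognormal to reflect
# uncertainty; sum the events in each simulated year to get that year's loss.
ale = np.array([rng.lognormal(mean=12.0, sigma=1.0, size=n).sum()
                for n in events_per_year])

print(f"Mean ALE:        ${ale.mean():,.0f}")
print(f"90th percentile: ${np.percentile(ale, 90):,.0f}")
```

Even this toy version makes the larger point of the post: a scenario with a frightening per-event magnitude but a near-zero frequency still produces a modest ALE.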


Learn more: What Exactly Is Loss Exposure?


Risk #2 Data Breach from Misdirected Email

Overview
We took a similar approach to the previous scenario and broke mail servers into the three data types most concerning to the customer (i.e. PII, Contractual, and Corporate Sensitive Data). According to the customer, they have never experienced a "breachable" event from this scenario, yet what does occur with some regularity is a coordinated effort among their Privacy, Legal and appropriate business units to investigate the situation and assuage any customer concerns. This large-scale response, in conjunction with a relatively high number of the aforementioned occurrences, made this the organization’s top Confidentiality risk.

Findings

The inadvertent release of Contractual data represented the largest portion of Annualized Loss Exposure (frequency × magnitude over a year), as this data type is most closely tied to the largest segment of their business. PII represented the smallest portion of Annualized Loss Exposure due to the relatively low volume of records tied to each incident.

Risk #3 Data Loss from Device Theft

Overview
This analysis focused on data loss from lost or stolen laptops and mobile devices. As a good portion of staff have frequent interaction with the customer base, it is not uncommon to find staff with a significant number of sensitive records on their laptops. For mobile devices, on the other hand, our focus shifted to what was accessible via the device. According to SMEs, staff store data in both approved and unapproved third-party file shares (i.e. Evernote, Box, Dropbox, etc.).

Findings
Losses were greatly reduced for mobile devices, as the organization utilizes a Mobile Device Management (MDM) solution and can remotely wipe data through it. When it came to laptops, the organization leverages strong authentication and patching procedures, yet its primary control is hard disk encryption.

Risk #4 Third Party Vendor Breach

Overview
This last Confidentiality scenario defined the exposure the bank faces due to a breach at one of its cloud storage vendors. The organization uses two cloud storage vendors: one sanctioned, the other not so much. There is better visibility into the controls environment of the approved vendor, and little to none into the unapproved one. In either case, a vendor is unlikely to disclose how frequently it experiences breach-related scenarios. With that in mind, we leveraged internal data as proxies, as well as Imperva’s Web Application Attack Report, to develop our Loss Event Frequency figures.

Findings
Overall, this scenario represented a low risk to the organization.  Although the organization has better visibility into the controls environment of the approved vendor, based on our research neither vendor has ever experienced a breach that has made it into the public eye.

Availability
Risk #5 Ransomware

Overview
The scope of this analysis focused on ransomware encrypting workstations and any mapped shared drives. Due to the implementation of a key obfuscation control (I can’t say any more than that), the organization has not experienced a successful ransomware infection in some time. As we knew this control was not foolproof, we took an average going back a few years to estimate the Loss Event Frequency, or number of successful ransomware incidents, as in the quick sketch below.
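As a worked illustration of that averaging step (with made-up counts, since the customer's actual history is confidential):

```python
# Hypothetical incident counts; averaging historical years approximates
# Loss Event Frequency (successful ransomware events per year).
incidents_per_year = {2015: 3, 2016: 1, 2017: 0, 2018: 1}
lef = sum(incidents_per_year.values()) / len(incidents_per_year)
print(f"Estimated LEF: {lef:.2f} events/year")  # -> 1.25
```

A simple average like this deliberately includes the years before the control was in place, which is one way to account for a control you know isn't foolproof.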

Findings

Of the two assets, shared drives represented the largest amount of Annualized Loss Exposure due to a greater impact on staff and longer backup and restoration procedures.


See also: How Much Risk Is Associated with Ransomware?


Risk #6 DDoS

Overview
For this analysis, we focused on the organization’s primary client-facing application.  Visibility into the number of attempted DDoS attacks (i.e. Threat Event Frequency) was limited, so we reached out to our partners at the Verizon DBIR team for assistance with our estimates.

Findings

Although Primary Loss Magnitude costs were elevated due to an increased response effort, the overall Annualized Loss Exposure was reduced thanks to their DDoS mitigation services and rollover procedures.

Integrity
Risk #7 Client Credential Compromise

Overview
This analysis again focused on the organization’s primary client-facing application, and the fraud concerns that arise when client credentials have been compromised. We learned in conversation with their Fraud Department that not all initiated fraudulent transactions (i.e. Threat Event Frequency) are successful (i.e. Loss Event Frequency). The Fraud Department has a short window in which to review and stop suspected fraudulent transactions, and can also request a wire recall, whereby fraudulent transactions are returned to the compromised customer.

Findings
Although successful fraudulent transactions (i.e. Loss Event Frequency) were roughly one-quarter of initiated fraudulent transactions (i.e. Threat Event Frequency), the internal investigation and remediation costs per event were substantial. As a result, this scenario represented not only the organization’s top Integrity scenario, but also their top risk scenario overall.
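To make the TEF-to-LEF relationship concrete, here's a back-of-the-envelope sketch. Only the "roughly one-quarter" success rate comes from the analysis above; every other number is invented for illustration.

```python
# Back-of-the-envelope TEF -> LEF math for credential-compromise fraud.
tef = 40             # initiated fraudulent transactions per year (hypothetical)
success_rate = 0.25  # roughly one-quarter evade fraud review / wire recall
lef = tef * success_rate

cost_per_event = 50_000  # hypothetical investigation + remediation cost
print(f"LEF: {lef:.0f} events/year, ALE: ${lef * cost_per_event:,.0f}")
```

Notice that even with three-quarters of attempts stopped, a steady event frequency multiplied by a substantial per-event response cost is exactly the "good mix of both sides of the ontology" that pushes a scenario to the top of the list.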


Learn more: 3 Common Mistakes Analysts Make Calculating Threat Event Frequency for Web Applications


Risk #8 Web Application Attack

Overview
The basis of this analysis was an audit finding that an internet-facing application was “vulnerable” to a specific kind of web application attack. Through this vector, a malicious actor could authenticate to the application and initiate a fraudulent transaction. Because this analysis stemmed from an audit finding, and NOT an actual loss event, we made a series of calibrated estimates based on the Information Security Department’s Security Code Reviews.

Findings
Although the per-event loss magnitude (i.e. the cost of investigating, remediating, and responding to customers, business partners, etc.) was high given the severity of this type of attack, the overall Annualized Loss Exposure was relatively low, due in large part to a reduced Loss Event Frequency. In conversation with SMEs, this type of web application attack was deemed relatively complicated, and not a threat actor’s first choice when looking to profit from a fraudulent transaction.

Risk #9 Fraudulent SWIFT Transactions

Overview
The basis for this analysis was fresh off the $81 million Bangladesh Bank SWIFT heist, but the focus was on fraudulent transactions stemming from our client’s SWIFT application. In discussion with SMEs, we learned that a series of complicated steps would be required to initiate a fraudulent SWIFT transaction at the bank: enumeration, leading to a spear phishing campaign, then compromising two authorized users or switching between admins. Not to mention that almost all SWIFT transactions emanate from one of the bank’s other applications, which is to say that a transaction emanating from the SWIFT application itself would receive special scrutiny.

Findings
As in the previous analysis, the per-event loss magnitude (i.e. the cost of investigating, remediating, and responding to customers, business partners, etc.) was high given the profile and size of transactions facilitated by the SWIFT application, yet the overall Annualized Loss Exposure was relatively low, due in large part to a reduced Loss Event Frequency. Many at the organization went into the analysis thinking this scenario would represent the highest risk to the bank, probably because they were only considering the impact side (i.e. the magnitude, or how bad it would be). What they failed to consider was the other side of the equation: the frequency. Keep in mind, you need both to represent risk in the FAIR model that powers our analyses.


Learn More: How to Scope a Risk Analysis with FAIR


Risk #10 Core Banking Applications Compromise

Overview

In discussion with SMEs, we learned that a fraudulent transaction stemming from the core banking applications would require a very similar means of attack, and would receive the same degree of scrutiny, as in the fraudulent SWIFT scenario. This is to say that the attack requires a high degree of complexity, and the core banking applications sit in a tightly controlled environment.

Findings
As the frequency and impact factors were found to be roughly the same as in the fraudulent SWIFT scenario, so too was the Annualized Loss Exposure: in other words, a low Annualized Loss Exposure driven by a low Loss Event Frequency.

For those of you who made it all the way to the end, thank you! If this kind of analysis is one that you are considering for your own organization, consider the following:
  1. Why would you conduct the analysis? What decisions, or which colleagues, are you trying to inform? This is one of the ways you can begin to narrow down the list of scenarios that should make up your Top 10.
  2. The other way to narrow down that list is to analyze scenarios that actually cause loss events at your organization. As I mentioned above, too many people allow the impact side of the FAIR model to dictate what makes it onto the Top 10 list, yet as we saw, it's the scenarios with a good mix of both sides of the ontology that rise to the top.
  3. Ten risk scenarios are not a requirement.  Ten is a nice round number, but if six scenarios help you achieve the first point I made above, why would you spend the extra time and effort analyzing another four iffy scenarios? Spend the time and effort where it matters.