Case Study Webinar: RiskLens Settles a Decision on Controls Investment

Listen to this webinar on demand to hear RiskLens Consultant Taylor Chester tell the story of a recent engagement with a large financial organization that started with a basic question: how should it decide between two types of controls (purging data or tokenizing records) to protect against malicious data exfiltration?

As Taylor takes you through it, you’ll see that the RiskLens process is as much about applying the rigorous question-and-answer method of the FAIR model as it is about running the RiskLens application to produce the final analysis results in financial terms.

FAIR analysis takes as its starting point a loss event – here, a data breach of personally identifiable customer information – and works toward a probable frequency of occurrence. Under Taylor’s guidance, the organization discovered it had good records on exfiltration by insiders but ran into a common roadblock when it looked for the same on outsiders: no data. As Taylor explains, “No data is data.” The organization had good records – and no sign of outsider attack – for 10 years. “That told us that, at a minimum, the event would likely occur once in every ten years.”
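The webinar doesn’t walk through the math behind turning “ten clean years” into a frequency estimate, but one common way to reason about zero observed events over an observation window is the statistical “rule of three,” which bounds the annual rate. A minimal sketch (an illustration of that rule, not the RiskLens platform’s actual method):

```python
# "No data is data": zero observed outsider-exfiltration events across an
# observation window still supports a bounded frequency estimate. The
# rule of three gives an approximate 95% upper bound on the annual event
# rate when zero events have been seen in n years: rate <= 3 / n.

def rate_upper_bound_95(years_observed: float) -> float:
    """Approximate 95% upper confidence bound on the annual event rate,
    given zero events observed over `years_observed` years (rule of three)."""
    if years_observed <= 0:
        raise ValueError("observation window must be positive")
    return 3.0 / years_observed

# Ten clean years of records -> at most ~0.3 events/year at 95% confidence,
# which is consistent with a working estimate on the order of once per decade.
print(rate_upper_bound_95(10))
```

The point is that an absence of incidents, backed by good records, is itself usable evidence for calibrating the frequency input.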

After establishing likely frequency, Taylor and the team next tackled the magnitude, or impact, side of the FAIR equation. In addition to the primary costs of staff time from the SOC or database team, there would be possible Secondary Loss in the FAIR model – for instance, paying for credit monitoring for customers or for judgment costs in a lawsuit. That required a research effort involving Legal, Marketing, IT, and other teams.

The big question here came down to: Would purging data – in other words, reducing the number of potentially affected records – produce less impact in a data breach than tokenizing – in other words, reducing the value of the records for attackers by anonymizing them?

With the frequency and magnitude information as inputs, the RiskLens platform produced a surprising answer, with one solution reducing loss exposure by twice as much as the other.
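To make the comparison concrete, here is a minimal Monte Carlo sketch of how frequency and magnitude inputs combine into annualized loss exposure for each control option. The figures below are hypothetical placeholders for illustration only – they are not the engagement’s actual inputs or results, and the RiskLens platform’s modeling is considerably richer than this:

```python
import random
import statistics

def annualized_loss_exposure(freq_per_year: float, loss_low: float,
                             loss_mode: float, loss_high: float,
                             trials: int = 50_000, seed: int = 42) -> float:
    """FAIR-style loss exposure sketch for one scenario: in each simulated
    year the event either occurs (probability ~= freq_per_year, a valid
    approximation for rates well below 1/yr) or it doesn't; when it occurs,
    the loss magnitude is drawn from a triangular distribution over the
    calibrated low / most-likely / high range."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        if rng.random() < freq_per_year:
            losses.append(rng.triangular(loss_low, loss_high, loss_mode))
        else:
            losses.append(0.0)
    return statistics.mean(losses)

# Hypothetical inputs (NOT the case study's figures), ~0.1 events/year:
# purging shrinks the number of affected records, so per-event magnitude
# drops; tokenizing shrinks the records' value to attackers, modeled here
# as a larger magnitude reduction.
baseline  = annualized_loss_exposure(0.1, 1_000_000, 5_000_000, 20_000_000)
purging   = annualized_loss_exposure(0.1,   500_000, 2_500_000, 10_000_000)
tokenized = annualized_loss_exposure(0.1,   200_000, 1_000_000,  4_000_000)
print(f"baseline: ${baseline:,.0f}  purging: ${purging:,.0f}  "
      f"tokenizing: ${tokenized:,.0f}")
```

Running each candidate control through the same frequency-times-magnitude simulation is what lets the two options be compared head to head in dollar terms.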

Fill out the form to view the webinar for the details…

 
