A 34-year veteran of the National Security Agency as an information assurance professional, the current Chief Evangelist for the Center for Internet Security, an all-around nice guy, and a heck of a guitar player too--when Tony Sager speaks, people in cybersecurity listen. Lately, Sager has been talking up the infosec profession's problem with the "fog of more," a takeoff on the phrase "fog of war," which describes the chaos and confusion of battle.
Watch Sager discuss the fog of more in this MIS Training Institute (MISTI) video.
Here’s his explanation, from the MISTI interview:
“Never in our history as defenders, have we had so much to work with. So many tools, training, certifications, events for learning, threat feeds to buy, services to contract for. It’s an incredible array of riches.
“Yet we don’t seem to be getting better. Actually, I think we are getting better but the bad guys are much more agile and get better faster…
“Why can’t we keep up? For me, the fog of more is a way to convey it’s not the lack of resources. It’s that there’s too much. So, people become paralyzed. They’re overwhelmed not only by the technology problems and the changes in the business--the way it uses technologies and the demands of the customers--but the emergence of new technology, [and] the bad guys are changing all the time. So, the challenge is sorting all that out.”
Sager suggests these fog-cutters:
1. Look for guidance from the collected wisdom of the profession – “a lot of really smart people have been in this business for a long period of time.” Specifically, he suggests the Center for Internet Security’s CIS Benchmarks (100+ configuration guidelines for various systems, developed by a global community of security experts) or the popular NIST Cybersecurity Framework (CSF).
2. Create an “action architecture…If you haven’t put in place that architecture of visibility and management, then all the information in the world won’t save you…that’s what the CIS controls are about.”
Sager believes this is a prudent approach because “80-90% of these [cybersecurity] things are just repeats of the same problem” and “you always have to prioritize in defense. The only question is what do you use to prioritize? And if you accept the idea that we all face this kind of mass market, common attack thing, we all need a basic infrastructure.”
Sound advice on the basics, and the CIS and NIST best-practice inventories and checklists do collect the security community's wisdom on how to combat 80-90% of threats.
But here's what's missing: if you truly want to prioritize, and if you want to build an even better "action architecture," you need to take a risk-based approach.
When it comes to risk, the checklist approach rests on an assumption: do more of the best practices and you will reduce more risk. A common problem in security (and one that seems to be heating up these past few weeks as "risk" becomes a major topic) is that we conflate vulnerability management with risk management.
At RiskLens, we're fans of the FAIR model that powers our application. The model defines risk as the probable frequency and probable magnitude of future loss from a cyber event. It builds on data from an organization's own experience with cyber loss events. The result is a range of probable losses (in other words, risk) in monetary terms, specific to that organization. With that in hand, the model can simulate the risk-reduction effect of various forms of controls.
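To make the frequency-times-magnitude idea concrete, here is a minimal Monte Carlo sketch in the spirit of a FAIR-style analysis (not the RiskLens application itself). It assumes, purely for illustration, that loss event frequency follows a Poisson distribution and per-event loss magnitude follows a lognormal distribution; the specific parameter values are hypothetical.

```python
import math
import random

def draw_poisson(rng, lam):
    """Draw a Poisson-distributed event count (Knuth's method; fine for small lam)."""
    threshold = math.exp(-lam)
    count, product = 0, 1.0
    while True:
        product *= rng.random()
        if product <= threshold:
            return count
        count += 1

def simulate_ale(freq_per_year, loss_mu, loss_sigma, years=10_000, seed=42):
    """Estimate annualized loss exposure (ALE) by simulating many years.

    freq_per_year      -- expected loss events per year (Poisson assumption)
    loss_mu/loss_sigma -- lognormal per-event loss parameters, in ln(dollars)
    Returns (mean annual loss, 90th-percentile annual loss).
    """
    rng = random.Random(seed)
    annual_losses = []
    for _ in range(years):
        events = draw_poisson(rng, freq_per_year)
        # Total loss this simulated year: one lognormal draw per event
        total = sum(rng.lognormvariate(loss_mu, loss_sigma) for _ in range(events))
        annual_losses.append(total)
    annual_losses.sort()
    mean = sum(annual_losses) / years
    p90 = annual_losses[int(0.9 * years)]
    return mean, p90

# Hypothetical baseline: ~2 incidents/year, median per-event loss e^11 (~$60k)
base_mean, base_p90 = simulate_ale(2.0, 11.0, 1.0)
# Hypothetical control that halves event frequency
ctrl_mean, ctrl_p90 = simulate_ale(1.0, 11.0, 1.0)
print(f"Baseline ALE ~ ${base_mean:,.0f}; with control ~ ${ctrl_mean:,.0f}")
```

Because the output is a distribution of annual losses rather than a single score, you can compare controls by how much they shift that distribution in dollar terms, which is what makes prioritization tractable.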
When an organization understands its current and potential risk, it has a clear idea of how to choose among those best practices and prioritize investment in security controls for maximum impact – less fog, better architecture, as Tony Sager might say.