Three major gaps in the Cyberspace Solarium Commission’s report that need to be addressed
Released in March 2020, the Cyberspace Solarium Commission’s report urges the U.S. government and private sector to adopt a “new, strategic approach to cybersecurity,” namely layered cyber deterrence.
Among the recommendations in the Commission’s lengthy 182-page report is that security vendors be responsible for providing security updates for their products or services for as long as they provide usability updates and bug fixes. Additionally, the report calls for Congress to “pass a law establishing that final goods assemblers of software, hardware and firmware are liable for damages from incidents that exploit known and unpatched vulnerabilities.”
The purpose of the Cyberspace Solarium Commission’s report is noble, and while the report acknowledges that its recommendations will not solve every problem, a few key gaps are worth highlighting and need to be addressed.
Primarily, this report demonstrates poor kill-chain-based thinking and misses the complexity challenge we all face in the tech-enabled critical infrastructure world. For example, most breaches are achieved by exploiting well-known and often old vulnerabilities for which patches and remediations are available but have not been implemented, most likely because of complacency or compatibility issues. But that’s just the beginning of an attack – the core organizational security investments go toward spotting and stopping attackers’ steps after they exploit that vulnerability. This includes everything an attacker executes across the kill chain: the methods used to attain persistence, evade other defenses, escalate access, and eventually take actions and steal data.
The report’s focus on vulnerability management is a futile gesture aimed at “making the wall taller,” as opposed to offering comprehensive guidance on building a “combined arms” defense-in-depth improvement. Vulnerable software is the root cause of many cybersecurity woes, and software security is something that can be improved, but the reality is that eliminating vulnerabilities entirely is impossible given the volume and complexity of the software in use.
A better strategy would be to improve regulation and make application and enterprise security testing a far more prominent requirement. Annual audits don’t cut it; continuous assessment tools that consume useful, machine-readable threat intelligence are now accessible on the market.
Liability for unpatched vulnerabilities would impact software market
Many vulnerabilities have been disclosed over the years, and some are more severe and more likely to be exploited than others. For example, CVE-2014-6271, also known as Shellshock, is considered one of the worst bugs ever found. It affected most versions of macOS, Linux and Unix and enabled bad actors to execute malicious code on vulnerable systems. This is an example of a vulnerability that most cybercriminals could easily exploit and that organizations should patch as soon as possible.
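The triage logic implied here – severity plus likelihood of exploitation drives patch urgency – can be sketched in a few lines. This is a toy illustration, not any vendor’s actual scoring model; the `Vuln` type and thresholds are assumptions for the example, while the CVSS score of 10.0 and in-the-wild exploitation of Shellshock are matters of record.

```python
from dataclasses import dataclass

@dataclass
class Vuln:
    cve_id: str
    cvss: float              # CVSS base score, 0.0-10.0
    exploited_in_wild: bool  # known active exploitation

def patch_priority(v: Vuln) -> str:
    """Toy triage rule: severity plus active exploitation drives urgency."""
    if v.exploited_in_wild and v.cvss >= 7.0:
        return "patch immediately"
    if v.cvss >= 7.0:
        return "patch this cycle"
    return "schedule"

# Shellshock scored a maximal 10.0 and was exploited in the wild within days.
shellshock = Vuln("CVE-2014-6271", 10.0, True)
print(patch_priority(shellshock))  # → patch immediately
```

Real programs weigh many more signals (exploit maturity, asset exposure, compensating controls), but the core idea is the same: not every CVE deserves the same response.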
The majority of organizations that suffer a breach due to the exploitation of a vulnerability failed to implement a patch that already existed. In those cases, the onus should fall entirely on the user, not the vendor. Some liability for vendors is reasonable, but we must also understand that no amount of quality control will prevent a determined hacker from finding a hole, and no company can afford to pursue that standard. For example, the WannaCry and NotPetya exploits were both allegedly created by state-sponsored adversaries. It’s far-fetched to think that any vendor has the resources to compete with a military hacking organization.
With this liability, software vendors would add more quality control, raising the end price of all software and limiting innovation. At the same time, the risk of entering the market may prove too high for small vendors. A balance must be struck.
The Cyberspace Solarium Commission needs to reconsider how this liability should work. Stiffening penalties for negligence in software creation, especially for multi-billion-dollar tech firms, seems reasonable, but it must be balanced against the cost trade-offs, the dampening of innovation, and other effective ways to deal with the challenges stemming from vulnerabilities. Otherwise, this is like requiring carmakers to build crash-proof cars so we can eliminate airbags – and then suing the carmaker when the cars turn out not to be crash-proof.
The report highlights poor kill-chain based thinking
Quite frankly, problems much bigger than unpatched vulnerabilities are emerging in the cybersecurity space today. In fact, less than 20% of breaches stem from the exploitation of vulnerabilities, according to Verizon’s 2020 Data Breach Investigations Report. The biggest trend in exploitation is configuration exploitation: misconfigured cloud servers and exposed credentials stored in software repositories are the culprits behind several companies’ data breaches. All of that accountability lies with the end consumer and further underscores why a reliable defense-in-depth strategy is required.
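The exposed-credentials problem is concrete enough to illustrate. Below is a minimal sketch of the kind of pattern matching a secrets scanner performs over repository contents; the two regexes and the `find_exposed_secrets` helper are simplified assumptions for the example (production scanners such as gitleaks or truffleHog use far richer rule sets and entropy checks), and the AWS key shown is Amazon’s published example key, not a real credential.

```python
import re

# Illustrative patterns only: one well-known key format, one generic rule.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_password": re.compile(r"password\s*=\s*['\"][^'\"]+['\"]", re.I),
}

def find_exposed_secrets(text: str) -> list[str]:
    """Return the names of secret patterns found in a blob of text."""
    return [name for name, pat in SECRET_PATTERNS.items() if pat.search(text)]

# A config file accidentally committed to a repo:
config = 'aws_key = "AKIAIOSFODNN7EXAMPLE"\npassword = "hunter2"'
print(find_exposed_secrets(config))  # → ['aws_access_key_id', 'generic_password']
```

Running checks like this in CI before every commit lands is exactly the kind of continuous, consumer-side control the report’s vendor-liability framing overlooks.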
Organizations should still seek to drive down attackers’ initial access points by focusing on vulnerability management, but they must also realize that vulnerabilities are just one of many initial points of entry into an enterprise’s network. The real damage is done when threat actors move laterally and take other actions within the environment, often operating like an insider.
But the Cyberspace Solarium Commission’s report does not state who is accountable for this movement. There needs to be a defense strategy that holistically deals with the problem of layered security controls not working effectively.
The report also fails to address how vendors can do a better job of testing before their product or service is even released. Continuous testing is not standardized or required in any meaningful way, but security vendors can take an approach similar to the testing stage of the systems development life cycle (SDLC) and ensure that their products actually defend against the threats they are designed to prevent.
Organizations need a layered defense approach to security. Instead of relying purely on vulnerability patches, companies can operationalize a rich body of technical knowledge that describes how attackers operate as well as the methods and software they use. We can now do a much better job by using emerging technology in tandem with this knowledge to emulate attacker behaviors and validate each layer of defense in depth, ensuring that everything is working and reliable. By making this a policy recommendation, the Cyberspace Solarium Commission would render its guidance far more practical.
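The validation loop described above can be sketched as a tiny harness: run benign emulations of known attacker behaviors, keyed by MITRE ATT&CK technique IDs, and record which defensive layers detected them. The technique IDs are real ATT&CK identifiers, but the emulation functions and their hard-coded detection results are placeholders standing in for what a breach-and-attack-simulation tool would actually execute.

```python
def emulate_t1059() -> bool:
    """T1059 Command and Scripting Interpreter: placeholder detection result."""
    return True   # assume the endpoint layer flagged the scripted command

def emulate_t1021() -> bool:
    """T1021 Remote Services (lateral movement): placeholder detection result."""
    return False  # assume the network layer missed the lateral hop

# Map each ATT&CK technique under test to its benign emulation.
EMULATIONS = {"T1059": emulate_t1059, "T1021": emulate_t1021}

def validate_controls() -> dict[str, bool]:
    """Run every emulation and report detection per technique."""
    return {tid: check() for tid, check in EMULATIONS.items()}

results = validate_controls()
gaps = [tid for tid, detected in results.items() if not detected]
print("coverage gaps:", gaps)  # → coverage gaps: ['T1021']
```

The point of the design is that the output is a per-technique coverage map rather than a pass/fail audit result, which is what makes continuous improvement measurable.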
Take a threat-informed approach to defense, not a patch-reliant one
While the Commission notes that both the U.S. government and the private sector have areas to work on, end-user organizations must understand that continuously testing security controls against relevant tactics, techniques and procedures (TTPs) will help them prepare for what comes next once an attacker penetrates their network.
The insights from these tests help measure the effectiveness of those defenses and drive continuous improvement. Coupled with a strategy for driving down initial access through vulnerabilities, the end result is an improved cybersecurity posture and more efficient business operations through a threat-informed (not patch-reliant) defense.