The U.S. government is stepping up its stance on cybersecurity, and it’s now time for organizations in both the private and public sectors to think about how they will comply. In March, the White House released a new National Cybersecurity Strategy, which calls for a “rebalance” of responsibility from end users to the owners and operators of key technologies and infrastructure.
This follows a memo the Office of Management and Budget issued late last year on migrating to post-quantum cryptography, as well as its earlier directive for federal agencies to move to a zero-trust architecture, a paradigm shift that “relies in part on the ubiquitous use of strong encryption throughout agencies.”
As cybersecurity becomes a more critical national and global issue, many would like to see an even greater sense of urgency in rolling out new policies and regulations. Still, there are several steps organizations can take right now to secure their data and infrastructure as quickly as possible, while also positioning themselves to comply with the national and industry-specific guidelines and regulations that will eventually emerge.
As organizations prepare for the inevitable, they need to keep these six actions in mind to stay ahead of the curve:
- Focus on key management
The hard reality is that if you don’t own your keys, you don’t own your data. Where key management is concerned, organizations need to focus on strong encryption algorithms, management across the entire key lifecycle, and full access to backups of both their data and their keys. We’re already seeing more and more federal agencies add layers of security (“defense in depth”), and automated, coordinated key management is one of those layers.
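The lifecycle bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not any vendor’s API: the class and field names are hypothetical, and a real key-management system would store material in an HSM or enclave rather than process memory.

```python
import secrets
from datetime import datetime, timezone

class ManagedKey:
    """Illustrative record covering the lifecycle stages named above:
    generation, rotation and revocation (names are hypothetical)."""

    def __init__(self, key_id: str):
        self.key_id = key_id
        self.versions = []          # old versions kept so backups stay decryptable
        self.state = "active"
        self.rotate()               # initial generation is version 1

    def rotate(self):
        """Generate fresh 256-bit material; prior versions remain for decrypt-only use."""
        self.versions.append({
            "material": secrets.token_bytes(32),   # strong random AES-256-size key
            "created": datetime.now(timezone.utc),
        })

    def current(self) -> bytes:
        if self.state != "active":
            raise PermissionError(f"key {self.key_id} is {self.state}")
        return self.versions[-1]["material"]

    def revoke(self):
        self.state = "revoked"      # material retained only for audited recovery

key = ManagedKey("payments-db")
key.rotate()
assert len(key.versions) == 2
key.revoke()   # current() would now raise: revoked keys must not encrypt new data
```

Retaining superseded versions after rotation is what preserves “full access to backups”: data encrypted under an old version can still be read, while new writes always use the latest material.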
- Key management doesn’t work without complete visibility
The moral of this story is that you can’t protect what you can’t see. Key provenance and immutable logging are essential. We’ll come back to this, but one reason government agencies look beyond what the public cloud providers offer is to get a better handle on immutable, centralized logging, as well as automation of key management and other tasks across both on-premises and cloud workloads.
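One common way to approximate immutable logging in software is a hash chain: each entry embeds the hash of the previous one, so any after-the-fact edit breaks the chain. A minimal sketch (field names are illustrative; production systems would also ship entries to write-once storage):

```python
import hashlib
import json
from datetime import datetime, timezone

class HashChainedLog:
    """Append-only audit log: tampering with any past entry invalidates
    every hash that follows it."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64   # genesis value

    def append(self, event: dict) -> str:
        record = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "prev": self._last_hash,
            "event": event,
        }
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        record["hash"] = digest
        self.entries.append(record)
        self._last_hash = digest
        return digest

    def verify(self) -> bool:
        prev = "0" * 64
        for r in self.entries:
            body = {k: r[k] for k in ("ts", "prev", "event")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if r["prev"] != prev or r["hash"] != expected:
                return False
            prev = r["hash"]
        return True

log = HashChainedLog()
log.append({"action": "key.create", "key": "payments-db"})
log.append({"action": "key.rotate", "key": "payments-db"})
assert log.verify()
log.entries[0]["event"]["action"] = "key.delete"   # tampering...
assert not log.verify()                            # ...is detected
```

Centralizing such logs across on-premises and cloud workloads is what turns per-system records into the “complete visibility” this section argues for.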
- Leverage automation for continuous access control
Security is paramount, of course, but being able to automate certain aspects of protecting your data brings it to a whole new level. In recent years, significant strides in automated technologies have helped organizations manage data security more efficiently. Benefits include data classification and risk assessment, least-privilege access (role-based access control), and automated and uniform policy enforcement. Automating mundane, tedious and repetitive tasks frees up human talent to focus on more valuable, creative and rewarding work, helping individuals and organizations reach new levels of innovation.
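The least-privilege, role-based access control mentioned above reduces, at its core, to a deny-by-default lookup that automation can enforce uniformly. A minimal sketch with hypothetical role and permission names:

```python
# Minimal role-based access control (RBAC): each role maps to the
# smallest permission set it needs, and every access is checked
# against that map. Names below are illustrative, not from any product.
ROLE_PERMISSIONS = {
    "auditor":  {"logs:read"},
    "operator": {"logs:read", "keys:rotate"},
    "admin":    {"logs:read", "keys:rotate", "keys:revoke"},
}

def authorize(role: str, permission: str) -> bool:
    """Deny by default: unknown roles or permissions get no access."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert authorize("operator", "keys:rotate")
assert not authorize("auditor", "keys:revoke")   # least privilege in action
assert not authorize("intern", "logs:read")      # unknown role: denied
```

Because the policy lives in one table rather than scattered `if` statements, the same check can be applied automatically and identically across every service.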
- Key components to effectively comply with security regulations
There are several components at work when considering the highest level of data security, including:
- Data center security
- FIPS 140-2 hardened appliances
- Network security
- FIPS-validated software
- Secure enclaves protected from the overarching infrastructure
- A side-channel-resistant crypto library
If these elements are effectively in place, organizations can run security-critical software within a trusted execution environment, a hardware-isolated enclave that also houses key data, secrets, passwords, tokens and more. This ultimately helps organizations defend against man-in-the-middle attacks and comply with the Cybersecurity and Infrastructure Security Agency’s new secure-by-design and secure-by-default guidelines. Even if someone gained root access to an agency’s or organization’s tech ecosystem, they still wouldn’t have access to vital software, passwords, keys and secrets, because the applications run within a secure enclave or trusted execution environment. Side-channel attacks and memory scraping also become more challenging to execute.
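The design principle behind an enclave or HSM can be illustrated at the interface level: callers request cryptographic operations but can never read the key itself. The sketch below only models that narrow API in ordinary Python (name mangling is not a security boundary); real enclaves enforce the isolation in hardware.

```python
import hashlib
import hmac
import secrets

class EnclaveKeyBroker:
    """Models the *interface* of an enclave/HSM-style key service:
    operations in, results out, key material never exposed. Purely
    illustrative; a real TEE provides hardware-enforced isolation."""

    def __init__(self):
        self.__key = secrets.token_bytes(32)   # never returned to callers

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self.__key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        # Constant-time comparison resists timing side channels
        return hmac.compare_digest(self.sign(message), tag)

broker = EnclaveKeyBroker()
tag = broker.sign(b"rotate key payments-db")
assert broker.verify(b"rotate key payments-db", tag)
assert not broker.verify(b"delete key payments-db", tag)
# There is deliberately no method that returns the key material.
```

An attacker with root access to the calling application could invoke `sign`, but in a hardware-backed version could not extract the key itself, which is the property the enclave architecture is buying.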
- Where cloud computing fits in
As organizations migrate significant portions of their architecture and data to the cloud, including Microsoft Azure, AWS and Google Cloud, they also need to deploy measures that protect sensitive data in these environments.
Proprietary key management from the cloud providers alone is not enough. Leveraging cloud-based versions of the previously mentioned technologies gives businesses flexibility, such as key custody across hybrid multicloud infrastructures, streamlined data access policies across providers, and complete key provenance with centralized log exports. Under these conditions, even if a cloud provider were compromised, malicious actors still couldn’t access sensitive information.
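The custody property at the end of that paragraph is usually achieved with envelope encryption: data is encrypted under a data-encryption key (DEK), and only a *wrapped* copy of the DEK is stored with the provider, while the key-encryption key (KEK) stays with the customer. The sketch below uses a one-time-pad XOR as the wrap purely for illustration; real deployments use a standardized key-wrap algorithm such as AES Key Wrap (RFC 3394/5649).

```python
import secrets

# Envelope-encryption custody sketch: the KEK never leaves the customer,
# so the cloud provider only ever holds the wrapped DEK.

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings (one-time-pad wrap, demo only)."""
    return bytes(x ^ y for x, y in zip(a, b))

kek = secrets.token_bytes(32)   # stays on-premises, under customer custody
dek = secrets.token_bytes(32)   # generated per dataset or object

wrapped_dek = xor(dek, kek)     # this is what gets stored in the cloud

# A compromised provider sees only wrapped_dek, never dek or kek:
assert wrapped_dek != dek
# The customer, holding the KEK, can always recover the DEK:
assert xor(wrapped_dek, kek) == dek
```

Because unwrapping requires the customer-held KEK, a breach on the provider side yields only ciphertext and wrapped keys, which is the scenario the paragraph above describes.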
- Crucial considerations of internal vs. cloud key management
Organizations must weigh several considerations when deciding whether to own key management for data privacy and security or to leave it to the cloud service providers. For example, managing keys yourself lets you default to the most stringent requirements across all geographic and industry regulations, which minimizes overall risk. Conversely, using a cloud service provider’s key management service as the root of trust cedes control, which contradicts the principles of a zero-trust policy.
There is a trade-off, however: While keeping workloads on-premises helps reduce risk, it may mean forgoing some of the innovations the cloud service providers have become known for. In my mind, though, reducing data security risk as much as possible trumps everything else.
As technology evolves and becomes more complex, so do the techniques of malicious actors looking to compromise systems and steal sensitive data. The recent health data breach that affected the employees of lawmakers and their families is just one example of what’s at stake. As new regulations emerge and individual industries and organizations are mandated to meet specific levels of compliance, those who consider the above points as soon as possible give themselves the best chance of success.
Anand Kashyap is CEO and co-founder of Fortanix.