Keeping your app’s secrets secret
The software development process has changed vastly over the past decade. Thanks to the relentless efforts of cloud and virtualization providers, we now have nearly limitless compute and storage resources at our fingertips. One may think of this as the first wave of automation within the application development and deployment process.
With the rise of automation, machines must authenticate to one another, and authorization is nearly implicit in that handshake. Secrets are increasingly used by applications and (micro)services as a bootstrapping mechanism for initiating and continuing operations. However, these secrets, which are largely credentials, need safekeeping and secure access in order to ultimately protect the end user. Left unmanaged, secrets sprawl over time, leading to leaks and their downstream consequences.
Along the way, programmers, testers, and release managers found radically new ways to build and deliver applications from development sandboxes to production environments. The emphasis shifted to more rapid software delivery, and the classic waterfall model was no longer desirable to the consumers of the technology. Agile quickly became the buzzword, and nearly every software team strove to become leaner in size and methodology.
A critical element of the delivery lifecycle was the sprint, which divided each project into many short, fast cycles of requirements articulation, programming, testing, and deployment. This drastically increased the quantity of code produced by each team and thereby put greater emphasis on code quality and release processes. Testing and deployment thus began their rapid ascent into automation, which has since resulted in a gargantuan number of secrets being created and referenced within code. These secrets can be seen as static or dynamic with respect to their use and longevity.
With the advent of container technology, the application team, now commonly referred to as DevOps, found newly empowered ways to build, test, and release. The underlying need for dedicated hardware faded away, and each team now produced several copies of its software for all manner of consumption.
Containers gave new meaning to the software lifecycle, as many application components became fragmented, with shortened lifespans. Containers could be summoned and discarded so easily that application teams now had to think of their code merely as a (micro)service within a larger ecosystem. Applications went from stateful to stateless, with services becoming context-aware only in the presence of secrets.
Containerization is gathering momentum, with Gartner reporting 60 percent of companies adopting it, up from 20 percent just three years ago. One can argue over whether Docker or Kubernetes has been the more influential offering in this trend, but cloud providers are equally responsible for its adoption.
Regardless, the need to actively manage secrets is now front and center for every application team. The question is whether your application secrets are really secret or simply a hard-to-reach set of variables. What is needed is a simple, prescriptive plan for ensuring better application security across the team. It is no longer the job of DevOps alone but the collective responsibility of DevSecOps.
Building blocks of application security
Application and information security teams need to focus on proactive prevention, recognizing that reactive detection cannot be the main tool in the arsenal. Getting ahead of adversarial code isn't trivial, but in practice it starts with a few simple steps. Secrets are the sentries to applications, and fortifying them requires a proactive approach, including:
1. Application inventory – Every information security leader should take it upon themselves to demand an audit of all applications within the enterprise. Armed with such a list, it is their responsibility to identify the domains that are critical to the business and/or sensitive to the customer. This list is by no means static and should be re-evaluated periodically as security models mature and threats evolve. The list may comprise applications (and/or microservices) designed in-house or leveraged externally from service providers.
Regardless, a matrix of all such applications and services needs to be audited for dependencies on code repositories, data storage, and cloud-augmented resources. Common external dependencies include GitHub, GitLab, Amazon AWS, Google Cloud, Microsoft Azure, DigitalOcean, OpenStack, Docker Hub, and so on. This is not a comprehensive list, so organizations should carefully audit each application and service for its dependencies, both in-house and external.
Upon discovering the repositories housing business-critical or customer-sensitive information, it is time to forge a plan for the security of the content residing in each. This plan acts as a manifesto for the enterprise, to which application teams must adhere. Established practices such as peer reviews and automation tools can ensure violations are mitigated in a timely manner, if not avoided entirely. Teams can appoint a Data Officer or Curator responsible for maintaining the standards and ensuring compliance.
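One lightweight way to capture such a matrix is as structured data that can be reviewed and diffed alongside code. The sketch below is purely illustrative; the fields, service names, and repository URLs are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    """One row of the application/service inventory matrix (illustrative fields)."""
    name: str
    owner: str                      # accountable team or Data Officer
    business_critical: bool
    customer_sensitive: bool
    code_repos: list = field(default_factory=list)        # e.g. GitHub/GitLab URLs
    external_services: list = field(default_factory=list)  # e.g. AWS, Azure, Docker Hub

inventory = [
    AppRecord(
        name="billing-api",          # hypothetical service
        owner="payments-team",
        business_critical=True,
        customer_sensitive=True,
        code_repos=["github.com/example/billing-api"],
        external_services=["AWS RDS", "Docker Hub"],
    ),
]

# Periodic review: flag entries that touch sensitive data but list no audited repositories.
for record in inventory:
    if record.customer_sensitive and not record.code_repos:
        print(f"Audit gap: {record.name} has no recorded code repositories")
```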
2. Code and resource repository standards – At a bare minimum, applications must transparently encrypt data at rest and transmit it securely over the network or across processes. However, there are times when even computation on the data within a process needs to occur securely. These are usually privileged processes that act upon highly sensitive data and must do so using either homomorphic encryption or a secure enclave, after weighing the practicality of each approach.
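For the common case of encryption at rest, a minimal sketch using the widely available Python cryptography library might look as follows; key handling is deliberately simplified, and in practice the key would come from a secrets manager or KMS rather than being generated inline.

```python
from cryptography.fernet import Fernet

# In practice the key is retrieved from a secrets manager or KMS,
# never generated and stored alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"4111 1111 1111 1111"        # sensitive payload to be stored at rest
ciphertext = cipher.encrypt(record)    # authenticated, versioned ciphertext token
assert cipher.decrypt(ciphertext) == record
```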
The next best option is to tokenize all sensitive data so that the encryption preserves the original format, as per NIST Special Publication 800-38G. Applications and services can continue to work with the tokenized content unless a privileged user or entity must recover the original content through an authorized request.
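Production systems would use a format-preserving cipher such as FF1 from NIST SP 800-38G; as a simplified illustration of the tokenization idea itself, the sketch below swaps sensitive values for random tokens of the same shape and keeps the mapping in a protected vault table (here just an in-memory dictionary).

```python
import secrets

_vault = {}  # token -> original value; in practice a hardened, access-controlled store

def tokenize(value: str) -> str:
    """Replace each digit with a random digit so the token keeps the original format.
    (Token collisions are ignored for brevity in this sketch.)"""
    token = "".join(secrets.choice("0123456789") if ch.isdigit() else ch for ch in value)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Privileged, audited lookup of the original value."""
    return _vault[token]

card = "4111-1111-1111-1111"
masked = tokenize(card)        # e.g. "8302-5571-0946-2213": same length and layout
assert detokenize(masked) == card
```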
Whether an application relies on encryption or tokenization, it needs to store, access, and assert the rights of users and other services. Hence it all comes down to a core set of secrets that applications rely upon in order to function normally under the rules set forth by their owners. When it comes to managing application secrets, several guidelines are available, ranging from the OWASP Top 10 to guidance on CSRF and ESCA.
Secrets were once used primarily to encrypt data at rest and in transit, but they are increasingly used for identity and access management. Secrets are littered across application delivery pipelines. They are found directly in code or configuration as credentials themselves, or as references to certificates or keys that are reused with suboptimal entropy to generate further secrets.
Most often these secrets manifest as environment variables passed to containers and/or virtual hosts. Securing the secrets – and, more importantly, providing the highest level of security for access to the secrets – becomes paramount to the application architecture.
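The environment-variable pattern is easy to spot in code, and anything delivered this way is visible to the platform and to anyone able to inspect the container's configuration. A minimal sketch (the variable name is hypothetical):

```python
import os

# Common but weak: the credential rides along in the container environment,
# where platform tooling and configuration inspection can read it back out.
db_password = os.environ.get("DB_PASSWORD")

if db_password is None:
    # Preferable: fetch the secret at runtime from a central secrets manager
    # over an authenticated, audited API instead of baking it into the environment.
    raise RuntimeError("DB_PASSWORD not set; fetch it from the secrets manager instead")
```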
3. Centralize secrets with dynamic credentials – There is a multitude of services and products that claim to provide security for application secrets. It is incumbent on the CISO to ask what actually makes a product or service secure. The answer comes down to a phrase – root of trust – which is now being uprooted by the concept of zero trust.
Almost all products and services offering secrets management are based on the former root-of-trust model, in which a master key must be secured – not a trivial undertaking given the hybrid and complex nature of deployments and dependencies. DevOps and DevSecOps teams are eager to vault or conjure all secrets and summon them freely across containers, hosts, virtualized services, and so on. What many do not realize is that the processes running these secret repositories are themselves quite vulnerable and can leak a plethora of ancillary secrets.
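Dynamic credentials invert the problem: instead of a long-lived password sitting in a repository or environment, the application requests a short-lived credential at startup and lets it expire. The sketch below assumes a HashiCorp Vault-style HTTP API with a database secrets engine mounted at database/ and a role named readonly; the endpoint, role, and environment variable names are assumptions for illustration.

```python
import os
import requests

VAULT_ADDR = os.environ["VAULT_ADDR"]      # e.g. https://vault.example.com:8200
VAULT_TOKEN = os.environ["VAULT_TOKEN"]    # short-lived token issued by the platform's auth method

# Ask the secrets engine to mint a short-lived database credential for this workload.
resp = requests.get(
    f"{VAULT_ADDR}/v1/database/creds/readonly",
    headers={"X-Vault-Token": VAULT_TOKEN},
    timeout=5,
)
resp.raise_for_status()
body = resp.json()

username = body["data"]["username"]
password = body["data"]["password"]
ttl_seconds = body["lease_duration"]       # the credential expires; renew or re-request before then
```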
Enterprises can no longer assume that teams are sufficiently mindful of application architecture when there are so many options that merely check the security box so teams can stay on schedule or within budget. By allowing this to continue, enterprises have made human gatekeepers the critical bearers of information security, thereby increasing their risk of exposure and leaks.
As NIST publication 800-207 comes to bear, many enterprises will realize the need for a true "zero trust" application architecture. This is available today for applications built on container orchestration platforms such as Google Kubernetes Engine or OpenShift, as well as from the leading cloud services on Azure, Google Cloud, and AWS. Authentication (AuthN) and authorization (AuthZ) have become intertwined, and together with mutual authentication they form the foundation for building zero trust within the application.
Fundamentally, a client is always requesting resources from a service (or server). Zero trust in this transaction translates to validated provenance of both client and server, enabling claims on resources based on associative rights. Trustworthy JSON Web Tokens are increasingly becoming the standard in this paradigm of strong security rooted in cryptography. Servers should deny any resource claims backed by invalid or expired tokens, and clients, similarly, should not accept unverifiable responses. Having centralized secrets management with strong access controls and a robust API is critical to application security.
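On the server side, enforcing this means strict verification of every presented token: signature, expiry, issuer, and audience are all checked before any claim is honored. Below is a minimal sketch using the PyJWT library; the issuer, audience, and key source are assumptions for illustration.

```python
import jwt  # PyJWT
from jwt import ExpiredSignatureError, InvalidTokenError

def authorize(token: str, public_key: str) -> dict:
    """Return verified claims, or refuse the request outright."""
    try:
        claims = jwt.decode(
            token,
            public_key,
            algorithms=["RS256"],                  # never accept 'none' or unexpected algorithms
            audience="billing-api",                # hypothetical service identifier
            issuer="https://auth.example.com",     # hypothetical trusted issuer
        )
    except ExpiredSignatureError:
        raise PermissionError("token expired; client must re-authenticate")
    except InvalidTokenError as exc:
        raise PermissionError(f"token rejected: {exc}")
    return claims   # resource claims are evaluated only from verified token contents
```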
Secrets management: Summary
The age of automation is just beginning, and information security goes hand in hand with end-user privacy and business continuity. We should take warning from the steady stream of attacks, many of which could have been thwarted by simple practices established gradually at the core of the enterprise.
Application teams may find it easier to pilot a single service more securely in this manner than to wait for the information security leader or CISO to codify it across the enterprise. The need for a proven secrets management application or service is ever-present. Pick a solution that is:
1. Flexible in its deployment model, whether on-premises, natively in the cloud, or some combination (hybrid, multi-cloud, etc.)
2. Secure in a way that goes beyond a simple key-value store that most secrets management providers ultimately provide
3. Capable of connecting to other applications and services through open standards such as OAuth, OpenID Connect, SAML, LDAP, trustworthy JWT, and PKI
4. Proven to work for national agencies and regulatory bodies alike, since these entities have the most stringent security requirements.