Retrieving CI/CD Secrets from Vault
Introduction
CI/CD pipelines need secure access to secrets and other sensitive values in order to authenticate to external services and deploy applications, infrastructure, or other automated processes. HashiCorp Vault is a centralized system for storing and accessing this data, which lets CI/CD pipelines push and pull secrets programmatically.
This article covers anti-patterns for secrets management and authentication, and provides guidance and resources for multiple CI/CD platforms.
Vault gives you multiple ways to manage identities and authentication, such as JWT/OIDC, LDAP, TLS certificates, tokens, and usernames and passwords, along with authentication through major cloud providers such as AWS, Azure, and GCP.
Because Vault supports so many authentication methods, you can choose how CI/CD pipelines retrieve data and build flexible workflows.
Anti-patterns
Hard-coded secrets
Do not store secrets in your code repository [CWE-259]. Your CI/CD pipeline should not contain hard-coded secrets, nor should the code that your pipeline runs. Instead, your pipelines should pull secrets from a secret store, such as a CI/CD-native secrets store, a cloud provider secret store, or a cloud-agnostic secret store such as HashiCorp Vault.
Hard-coded Vault authentication
Do not store Vault authentication tokens or passwords in your code repository [CWE-798]. Instead, users or processes should use a secure authentication method, such as JWT/OIDC or an external authentication method, for dynamic authentication with a lifecycle policy.
Static vs Dynamic Secrets
Dynamic secrets
When you request access to a secret, Vault generates a dynamic secret. Dynamic secrets do not exist until a user or system reads them, so there is no risk of someone stealing them or of another client using the same secrets. Because Vault has built-in revocation mechanisms, it can revoke dynamic secrets immediately after use, minimizing the time a secret exists. Vault supports multiple secrets engines that connect to services such as CI/CD tools and generate dynamic credentials on demand. Think of these secrets engines as plugins for Vault that connect to external services such as AWS, Azure, GCP, Kubernetes, databases, and more. Once a secrets engine is enabled and Vault has authenticated to the external source, users can request credentials to access it.
An example of a CI/CD pipeline using dynamic secrets is a job that needs to reach out to S3 for an object. Instead of hard-coding AWS credentials in code or plaintext files or setting them as CI/CD environment variables, the CI/CD pipeline can authenticate to Vault using one of the supported authentication methods. Once the pipeline authenticates to Vault, Vault issues the CI/CD job temporary credentials that expire once the CI/CD pipeline finishes its job. With these credentials, the job can pull objects from S3.
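The S3 scenario above can be sketched on the Vault side with the AWS secrets engine. This is a minimal, illustrative setup; the role name `ci-s3-read`, the region, and the wide-open `Resource` in the policy are assumptions you would tighten in practice.

```shell
# Enable the AWS secrets engine (mount path is the default, "aws")
vault secrets enable aws

# Give Vault IAM credentials it can use to mint dynamic ones
vault write aws/config/root \
    access_key="$AWS_ACCESS_KEY_ID" \
    secret_key="$AWS_SECRET_ACCESS_KEY" \
    region=us-east-1

# Define a role mapped to an IAM policy that allows S3 reads
vault write aws/roles/ci-s3-read \
    credential_type=iam_user \
    policy_document=-<<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "s3:GetObject", "Resource": "*" }
  ]
}
EOF

# The CI/CD job, once authenticated to Vault, requests short-lived credentials
vault read aws/creds/ci-s3-read
```

Each `vault read aws/creds/ci-s3-read` returns a fresh credential pair with its own lease, which Vault revokes when the lease expires or the job finishes.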
Use-cases
- Database access
- Cloud access (e.g., AWS, GCP, Azure)
- SSH
- Authentication credentials
Tutorials
- Database Credential Rotation
- Dynamic secrets for AWS authentication for S3 access
- Getting started with dynamic secrets
- SSH Secrets Engine: One-Time SSH Password
Static secrets
Vault stores static secrets in the KV (key/value) secrets engine. Once Vault enables the KV secrets engine, users can create KV secrets such as passwords, API keys, and certificates. CI/CD pipelines can authenticate to Vault and retrieve secrets instead of storing secrets in plain text files or code.
An example of a CI/CD pipeline using static secrets is a job that needs to reach out to a Google service. Vault can use the KV secrets engine to store the API key as a static secret. Then, when the pipeline runs, the job authenticates to Vault and retrieves the API key.
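The Google API key scenario can be sketched with the KV secrets engine. The mount path `secret`, the secret path `ci/google`, and the key value are illustrative.

```shell
# Enable a KV version 2 secrets engine at the path "secret"
vault secrets enable -path=secret kv-v2

# Store the Google API key as a static secret (value is a placeholder)
vault kv put secret/ci/google api_key="example-api-key"

# The CI/CD job, once authenticated, reads the key at run time
vault kv get -field=api_key secret/ci/google
```

Rotating the key then only requires a new `vault kv put`; no pipeline code or CI variables change.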
Use-cases
- Storing third-party API keys
- Certificates
- Authentication credentials
GitLab
GitLab uses JSON Web Tokens (JWT) to authenticate with Vault and securely access secrets for CI/CD pipelines. Once authenticated, GitLab can pull static secrets from the KV secrets engine, or dynamic secrets from engines such as the AWS secrets engine.
To set up authentication between GitLab and Vault, follow the Using external secrets in CI or Authenticating and reading secrets with HashiCorp Vault guide. Both guides walk you through authentication but handle secret access differently.
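On the Vault side, the setup those guides walk through looks roughly like the following. The GitLab hostname, role name, policy name, and bound claims are illustrative placeholders; `bound_claims` is what restricts which project and branch can log in.

```shell
# Enable the JWT auth method and point it at GitLab's JWKS endpoint
vault auth enable jwt
vault write auth/jwt/config \
    jwks_url="https://gitlab.example.com/-/jwks" \
    bound_issuer="gitlab.example.com"

# Create a role that GitLab CI jobs log in with
vault write auth/jwt/role/ci-role -<<EOF
{
  "role_type": "jwt",
  "policies": ["ci-policy"],
  "token_explicit_max_ttl": 60,
  "user_claim": "user_email",
  "bound_claims": {
    "project_id": "22",
    "ref": "main",
    "ref_type": "branch"
  }
}
EOF
```

The short `token_explicit_max_ttl` keeps the Vault token alive only long enough for the job to fetch its secrets.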
Static secrets
To use static secrets, reference the secrets:vault keyword in the secrets portion of your gitlab-ci.yml file.
In the following example, secrets:vault pulls a secret from the Vault KV store and sets the value in the DATABASE_PASSWORD environment variable. Jobs can then use the secret stored in the environment variable to authenticate to the corresponding database. Refer to the Use Vault Secrets in a CI Job documentation for details.
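A minimal gitlab-ci.yml sketch of this pattern follows. The Vault address, the `production/db` path, the `kv` mount, and the database client command are assumptions; the `path/field@mount` shorthand and the `id_tokens` keyword follow GitLab's documented syntax.

```yaml
read_database_password:
  id_tokens:
    VAULT_ID_TOKEN:
      aud: https://vault.example.com   # audience your Vault JWT role expects
  secrets:
    DATABASE_PASSWORD:
      vault: production/db/password@kv   # path production/db, field password, mount kv
      token: $VAULT_ID_TOKEN
  script:
    # GitLab writes the secret to a file by default and exposes its path here
    - my-db-client --password-file "$DATABASE_PASSWORD"
```

By default GitLab stores the retrieved value in a temporary file and sets the variable to that file's path, which keeps the raw secret out of the process environment.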
You can also pull static secrets and set them to environment variables from the CLI as shown in the Authenticating with HashiCorp Vault tutorial.
The following diagram shows the steps a GitLab CI/CD pipeline takes to retrieve a secret from Vault.
Dynamic secrets
GitLab users can also take advantage of dynamic secrets engines. Once you set up JWT authentication to Vault as described above, you can enable a dynamic secrets engine such as the AWS secrets engine in Vault. The AWS secrets engine lets GitLab CI/CD jobs request short-lived, dynamic AWS credentials.
The following is an example of using dynamic AWS credentials in a GitLab job.
Note: You may need to set SESSION_TOKEN in the following example.
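A sketch of such a job follows. The Vault role names, bucket, and audience are illustrative; the job authenticates with its JWT, then reads fresh credentials from the AWS secrets engine.

```yaml
deploy_to_aws:
  id_tokens:
    VAULT_ID_TOKEN:
      aud: https://vault.example.com
  script:
    # Exchange the job's JWT for a Vault token (role name is illustrative)
    - export VAULT_TOKEN="$(vault write -field=token auth/jwt/login role=ci-role jwt=$VAULT_ID_TOKEN)"
    # Request short-lived AWS credentials from the AWS secrets engine
    - export AWS_ACCESS_KEY_ID="$(vault read -field=access_key aws/creds/ci-role)"
    - export AWS_SECRET_ACCESS_KEY="$(vault read -field=secret_key aws/creds/ci-role)"
    # For assumed_role or federation_token credential types, also export the session token:
    # - export AWS_SESSION_TOKEN="$(vault read -field=security_token aws/creds/ci-role)"
    - aws s3 cp s3://my-bucket/artifact.tgz .
```

The credentials expire on their Vault lease, so nothing long-lived is ever stored in GitLab CI/CD variables.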
Tutorials
- Authenticating and reading secrets with HashiCorp Vault
- Using external secrets in CI
- AWS Dynamic Secrets Engine
Resources
Guy Barros, a Senior Solutions Engineer at HashiCorp, maintains a repository with Terraform code to automate the JWT auth method integration between HCP Vault and GitLab. Barros demonstrates how to use the Terraform code in the Codify Your JWT-OIDC Vault Auth Method with Terraform HashiTalks video.
The GitLab Unfiltered video How to integrate GitLab CI with HashiCorp Vault to retrieve secrets (via JWT or "secrets:") uses the AWS Quick Start to launch HashiCorp Vault on AWS and demonstrates how to set up policies, roles, and authentication to Vault.
GitHub Actions
HashiCorp provides a helper GitHub Action that you can plug into your GitHub Actions CI/CD pipelines. We recommend using a Vault authentication method such as the JWT auth method with GitHub OIDC tokens or the AppRole auth method.
The Vault helper action lets practitioners authenticate to Vault in multiple ways, so developers can implement the authentication that best fits their CI/CD workflow. Find a complete list of supported authentication methods here.
Once authenticated, GitHub Actions pipelines can request secrets from any Vault secrets engine that serves secrets via GET requests. For example, if you want to pull dynamic AWS credentials, you can use the AWS secrets engine to generate and retrieve credentials, since the AWS secrets engine uses GET requests.
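A workflow sketch using the hashicorp/vault-action helper follows. The Vault address, role name, and secret paths are assumptions; the `permissions: id-token: write` block is what lets the job mint a GitHub OIDC token for the JWT login.

```yaml
jobs:
  retrieve-secrets:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required so the job can request a GitHub OIDC token
      contents: read
    steps:
      - name: Import secrets from Vault
        uses: hashicorp/vault-action@v3
        with:
          url: https://vault.example.com:8200   # illustrative Vault address
          method: jwt
          role: github-actions-role             # hypothetical Vault JWT role
          secrets: |
            secret/data/ci/app api_key | APP_API_KEY ;
            aws/creds/ci-role access_key | AWS_ACCESS_KEY_ID
      - name: Use the secrets
        run: my-deploy-tool --key "$APP_API_KEY"
```

Each line of `secrets` is a GET against a Vault path, which is why any GET-served engine, static or dynamic, works here.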
Tutorials
- Follow the Vault GitHub Actions tutorial to define a GitHub workflow within your repository and request the required secrets with Vault GitHub actions.
- AWS Dynamic Secrets Engine
Resources
- Ricardo Oliveira's blog post Push Button Security for Your GitHub Actions walks practitioners through securing their GitHub workflows using HCP Vault.
- Sheryl-Ann Lee walks through configuring a secure GitOps Workflow with GitHub Actions and Vault.
- GitHub Action - Vault Secrets
Jenkins
Jenkins uses plugins to integrate with many third-party tools. Traditionally, you would manage Jenkins secrets for pipelines with Jenkins Credential Management in the Jenkins Controller. Using this plugin binds all the secrets to environment variables, and it masks these variables when they are echoed to the pipeline logs. However, using this plugin can lead to secret sprawl: credentials for external systems, such as Git tokens, database credentials, or other secrets the pipeline needs, are also stored in Jenkins. The Jenkins Controller uses the RBAC Jenkins user database to manage access controls instead of a more secure, granular ACL based on the credentials path.
Note: Jenkins has a Controller-Agent architecture, formerly known as Master-Slave.
A best-practice approach to injecting and using secrets and credentials during pipeline execution is to use the API integration with HashiCorp Vault for every pipeline step. In this case, Vault manages authorization for the stored secrets based on an externally authenticated identity. The Jenkins credentials plugin can secure the authentication process to Vault: when using authentication methods like Vault AppRole, tokens, or JWT, the credentials binding from the Credentials Management plugin helps protect the Vault authentication. Authenticating to Vault allows the pipelines to make API calls to Vault and retrieve the secrets needed to complete the pipeline job.
The Jenkins Vault plugin and other methods
Depending on how much protection you need against secrets appearing in Jenkins pipeline logs, there are different approaches to authenticating to Vault from a Jenkins pipeline.
- Using the Jenkins Vault plugin as an auth method helper and for secrets binding during pipeline execution. This masks any secret retrieved from Vault in the pipeline logs.
- Using credentials binding for VAULT_TOKEN, VAULT_ADDR, and VAULT_NAMESPACE, and then making a REST API call to Vault in the pipeline using the previously masked environment variables.
- Implementing Vault Agent on the Jenkins agent to access required secrets shared across different pipelines. This does not require extra plugins and can be a good fit for ephemeral agents with a Vault Agent sidecar configuration. The disadvantage of this method is that retrieved secrets can accidentally appear in the pipeline logs.
How to use Vault dynamic secrets in Jenkins
Using the Jenkins Vault plugin, it is possible to retrieve dynamic Vault secrets and protect them in the pipeline logs. These secrets include Terraform API tokens, GCP keys, and database credentials, to name a few. We can do that in the pipeline-as-code definition by specifying the engine version 1 specification of the plugin, as in the following example:
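A declarative Jenkinsfile sketch of this pattern follows, using the plugin's `withVault` step. The Vault URL, credential ID, and database role are illustrative; `engineVersion: 1` matches the non-versioned response format that dynamic engines return.

```groovy
pipeline {
    agent any
    stages {
        stage('read-dynamic-secret') {
            steps {
                withVault(
                    configuration: [
                        vaultUrl: 'https://vault.example.com:8200',
                        vaultCredentialId: 'vault-approle',   // hypothetical Jenkins credential ID
                        engineVersion: 1
                    ],
                    vaultSecrets: [
                        // Dynamic database credentials; values are masked in the logs
                        [path: 'database/creds/ci-role', engineVersion: 1, secretValues: [
                            [envVar: 'DB_USERNAME', vaultKey: 'username'],
                            [envVar: 'DB_PASSWORD', vaultKey: 'password']
                        ]]
                    ]
                ) {
                    sh 'psql "host=db user=$DB_USERNAME password=$DB_PASSWORD" -c "select 1"'
                }
            }
        }
    }
}
```

Everything bound inside the `withVault` block is scoped to those steps and masked if echoed.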
If you are not using the Jenkins Vault plugin, you can make a REST API call to Vault while masking the VAULT_TOKEN environment variable in the pipeline logs. Masking the VAULT_TOKEN environment variable is possible using Credentials Management in Jenkins.
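A minimal sketch of such a REST call, assuming VAULT_ADDR and VAULT_TOKEN are injected as masked Jenkins credentials and the secret lives at an illustrative KV path:

```shell
# Read a secret over Vault's HTTP API; the token never appears in the logs
curl --silent --header "X-Vault-Token: $VAULT_TOKEN" \
     "$VAULT_ADDR/v1/secret/ci/app" | jq -r '.data.api_key'
```

Note that only the token is masked; anything the pipeline echoes from the response body is your responsibility to keep out of the logs.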
We can also use Vault Agent to manage token caching, avoiding the use of any Jenkins plugin in that case. An external process executed on the Jenkins agent (an admin or operator pipeline, or an init process) can manage the Vault login that retrieves the token to be used in the VAULT_TOKEN environment variable, and store it in a Jenkins credential.
CircleCI
CircleCI uses OIDC tokens to authenticate with Vault. Vault supports OIDC and offers a lab to help practitioners learn Vault's OIDC auth method. We recommend completing the lab before setting up the CircleCI Vault integration.
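On the Vault side, trusting CircleCI's OIDC tokens looks roughly like the following. The `<ORG_ID>` placeholder is your CircleCI organization ID; the role and policy names are illustrative.

```shell
# Enable JWT auth and trust CircleCI's per-organization OIDC issuer
vault auth enable jwt
vault write auth/jwt/config \
    oidc_discovery_url="https://oidc.circleci.com/org/<ORG_ID>" \
    bound_issuer="https://oidc.circleci.com/org/<ORG_ID>"

# Role that CircleCI jobs log in with; the token audience is the org ID
vault write auth/jwt/role/circleci-role \
    role_type="jwt" \
    user_claim="sub" \
    bound_audiences="<ORG_ID>" \
    policies="ci-policy" \
    ttl="10m"
```

A CircleCI job can then exchange its `$CIRCLE_OIDC_TOKEN` for a short-lived Vault token via `vault write auth/jwt/login role=circleci-role jwt=$CIRCLE_OIDC_TOKEN`.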
Resources
- Rosemary Wang (Developer Advocate, HashiCorp) pairs with Angel Rivera (Developer Advocate, CircleCI) to inject static secrets from HashiCorp Vault into a CircleCI pipeline in their HashiTalks video.