‘Security Of the Pipeline’ and ‘Security In the Pipeline’ with AWS DevOps Tools By Design

There are many great tools for building CI/CD pipelines on the AWS Cloud; for the sake of simplicity, I am limiting this discussion to AWS native tools.

In-Short

CaveatWisdom

Caveat: Achieving speed, scale, and agility is important for any business; however, it should not come at the expense of security.

Wisdom: Security should be implemented by design in a CI/CD pipeline, not added as an afterthought.

Security Of the CI/CD Pipeline: It is about defining who can access the pipeline and what they can do. It is also about hardening the build servers and deployment conditions.

Security In the CI/CD Pipeline: It is about static code analysis and validating the artifacts generated in the pipeline.

In-Detail

The challenge when automating the whole Continuous Integration and Continuous Delivery / Deployment (CI/CD) process is implementing security at scale. This can be achieved in DevSecOps by implementing Security by Design in the pipeline.

Security added as an afterthought

Security by Design

Security by Design gives us the confidence to deliver at high speed and improves the security posture of the organization. So, we should think about security from every aspect and at every stage while designing the CI/CD pipeline, and implement it while building the pipeline.

Some Basics

Continuous Integration (CI): It is the process that starts with committing the code to the source control repository and includes building and testing the artifacts.

Continuous Delivery / Deployment (CD): It is the process that extends CI through to deployment. The difference between Delivery and Deployment is that in Delivery there is a manual intervention (approval) phase before deploying to production, whereas Deployment is fully automated, with thorough automated testing and rollback in case of any deployment failure.

DevSecOps AWS Toolset for Infrastructure and Application Pipelines

By following the Security Perspective of the AWS Cloud Adoption Framework (CAF), we can build the capabilities for implementing Security of the Pipeline and Security in the Pipeline.

Implementing Security Of the CI/CD Pipeline

While designing security of the pipeline, we need to look at the whole pipeline as a resource, apart from the security of the individual elements in it. We need to consider the following factors from a security perspective:

  1. Who can access the pipeline
  2. Who can commit the code
  3. Who can build the code and who is responsible for testing
  4. Who is responsible for Infrastructure as Code
  5. Who is responsible for deployment to the production

Security Governance

Once we establish what access controls are needed for the pipeline, we can develop the security governance capability of the pipeline by implementing directive and preventive controls with AWS IAM and AWS Organizations policies.

We need to follow the principle of least privilege and grant only the necessary access. For example, read-only access to a pipeline can be given with the following policy:

{
  "Statement": [
    {
      "Action": [
        "codepipeline:GetPipeline",
        "codepipeline:GetPipelineState",
        "codepipeline:GetPipelineExecution",
        "codepipeline:ListPipelineExecutions",
        "codepipeline:ListActionExecutions",
        "codepipeline:ListActionTypes",
        "codepipeline:ListPipelines",
        "codepipeline:ListTagsForResource",
        "iam:ListRoles",
        "s3:ListAllMyBuckets",
        "codecommit:ListRepositories",
        "codedeploy:ListApplications",
        "lambda:ListFunctions",
        "codestar-notifications:ListNotificationRules",
        "codestar-notifications:ListEventTypes",
        "codestar-notifications:ListTargets"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:codepipeline:us-west-2:123456789111:ExamplePipeline"
    },
    {
      "Action": [
        "codepipeline:GetPipeline",
        "codepipeline:GetPipelineState",
        "codepipeline:GetPipelineExecution",
        "codepipeline:ListPipelineExecutions",
        "codepipeline:ListActionExecutions",
        "codepipeline:ListActionTypes",
        "codepipeline:ListPipelines",
        "codepipeline:ListTagsForResource",
        "iam:ListRoles",
        "s3:GetBucketPolicy",
        "s3:GetObject",
        "s3:ListBucket",
        "codecommit:ListBranches",
        "codedeploy:GetApplication",
        "codedeploy:GetDeploymentGroup",
        "codedeploy:ListDeploymentGroups",
        "elasticbeanstalk:DescribeApplications",
        "elasticbeanstalk:DescribeEnvironments",
        "lambda:GetFunctionConfiguration",
        "opsworks:DescribeApps",
        "opsworks:DescribeLayers",
        "opsworks:DescribeStacks"
      ],
      "Effect": "Allow",
      "Resource": "*"
    },
    {
      "Sid": "CodeStarNotificationsReadOnlyAccess",
      "Effect": "Allow",
      "Action": [
        "codestar-notifications:DescribeNotificationRule"
      ],
      "Resource": "*",
      "Condition": {
        "StringLike": {
          "codestar-notifications:NotificationsForResource": "arn:aws:codepipeline:*"
        }
      }
    }
  ],
  "Version": "2012-10-17"

Many such policies can be found in the AWS documentation.

Usually, the Production, Development, and Testing environments are isolated in separate AWS accounts under AWS Organizations for maximum security, and security controls are established with organizational Service Control Policies (SCPs).
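
As an illustration, here is a minimal sketch of an SCP that prevents member accounts from tampering with audit trails and configuration recording; the exact controls will depend on your organization's requirements.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ProtectAuditServices",
      "Effect": "Deny",
      "Action": [
        "cloudtrail:StopLogging",
        "cloudtrail:DeleteTrail",
        "config:StopConfigurationRecorder",
        "config:DeleteConfigurationRecorder"
      ],
      "Resource": "*"
    }
  ]
}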

Threat Detection and Configuration Changes

It is a challenging task to ensure that everyone in the organization follows best practices when creating CI/CD pipelines. We can track changes in the configuration of pipelines with AWS Config and take remediation actions by integrating Lambda functions.
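
A sketch of how this could look in CloudFormation (JSON): an AWS Config custom rule scoped to CodePipeline resources that invokes an evaluation Lambda function whenever a pipeline's configuration changes. The function name and ARN are hypothetical placeholders, and the evaluation logic itself would live in that function.

{
  "Resources": {
    "PipelineConfigRule": {
      "Type": "AWS::Config::ConfigRule",
      "Properties": {
        "ConfigRuleName": "codepipeline-config-change-check",
        "Scope": {
          "ComplianceResourceTypes": ["AWS::CodePipeline::Pipeline"]
        },
        "Source": {
          "Owner": "CUSTOM_LAMBDA",
          "SourceIdentifier": "arn:aws:lambda:us-west-2:123456789111:function:check-pipeline-config",
          "SourceDetails": [
            {
              "EventSource": "aws.config",
              "MessageType": "ConfigurationItemChangeNotification"
            }
          ]
        }
      }
    }
  }
}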

It is important to audit pipeline access with AWS CloudTrail, both to find knowledge gaps in personnel and to detect nefarious activity.

 Implementing Security In the CI/CD Pipeline

Let us look at Security in the Pipeline at each stage. Although there could be many stages in a pipeline, we will consider the main stages: Code, Build, Test, and Deploy. If any security test fails, the pipeline stops, and the code does not move to production.

Code Stage

AWS CodePipeline supports various source control services such as GitHub, Bitbucket, S3, and CodeCommit. The advantage with CodeCommit and S3 is that we can integrate them with AWS IAM and define who can commit to the code repository and who can merge pull requests.
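
For example, here is a sketch of an IAM policy, based on the branch-protection pattern in the CodeCommit documentation, that denies direct pushes and pull-request merges to the main branch for the principals it is attached to; the repository name and ARN are placeholders.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyChangesToMainBranch",
      "Effect": "Deny",
      "Action": [
        "codecommit:GitPush",
        "codecommit:DeleteBranch",
        "codecommit:PutFile",
        "codecommit:MergePullRequestByFastForward",
        "codecommit:MergePullRequestBySquash",
        "codecommit:MergePullRequestByThreeWay"
      ],
      "Resource": "arn:aws:codecommit:us-west-2:123456789111:ExampleRepo",
      "Condition": {
        "StringEqualsIfExists": {
          "codecommit:References": ["refs/heads/main"]
        },
        "Null": {
          "codecommit:References": "false"
        }
      }
    }
  ]
}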

We can automate static code analysis with Amazon CodeGuru once the code is pushed to source control. Amazon CodeGuru uses program analysis and machine learning to detect potential defects that are difficult for developers to find and offers suggestions for improving your Java and Python code.

CodeGuru also detects hardcoded secrets in the code and recommends remediation steps to secure the secrets with AWS Secrets Manager.
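
As a sketch, associating a CodeCommit repository with CodeGuru Reviewer can be done declaratively in CloudFormation (JSON); the repository name below is a placeholder.

{
  "Resources": {
    "CodeGuruAssociation": {
      "Type": "AWS::CodeGuruReviewer::RepositoryAssociation",
      "Properties": {
        "Name": "ExampleRepo",
        "Type": "CodeCommit"
      }
    }
  }
}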

Build Stage

Securing the build servers and detecting their vulnerabilities is important when we maintain our own build servers such as Jenkins. We can use Amazon GuardDuty for threat detection and Amazon Inspector for vulnerability scanning.
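
For instance, turning on GuardDuty in the account that hosts the build servers is a one-resource CloudFormation (JSON) template:

{
  "Resources": {
    "ThreatDetector": {
      "Type": "AWS::GuardDuty::Detector",
      "Properties": {
        "Enable": true
      }
    }
  }
}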

We can avoid the headache of maintaining a build server altogether if we use the fully managed build service AWS CodeBuild.

The artifacts built at the build stage can be stored securely in ECR and S3. Amazon ECR supports automated scanning of images for vulnerabilities.
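
For example, scan-on-push can be enabled on an ECR repository when it is created; a minimal CloudFormation (JSON) sketch, with a hypothetical repository name:

{
  "Resources": {
    "AppImageRepository": {
      "Type": "AWS::ECR::Repository",
      "Properties": {
        "RepositoryName": "example-app",
        "ImageScanningConfiguration": {
          "ScanOnPush": true
        }
      }
    }
  }
}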

The code and build artifacts stored on S3, CodeCommit and ECR can be encrypted with AWS KMS keys.
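
As an example, here is a sketch of the artifact-store fragment of a CodePipeline definition (as passed to create-pipeline) that points at a customer managed KMS key; the bucket name and key ARN are placeholders.

{
  "artifactStore": {
    "type": "S3",
    "location": "example-pipeline-artifacts",
    "encryptionKey": {
      "id": "arn:aws:kms:us-west-2:123456789111:key/1234abcd-12ab-34cd-56ef-1234567890ab",
      "type": "KMS"
    }
  }
}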

Test Stage

AWS CodePipeline supports various services for the test action, including AWS CodeBuild, CodeCov, Jenkins, Lambda, Device Farm, and others.

You can use AWS CodeBuild for unit testing and compliance testing by integrating test scripts into it, and stop the pipeline if the tests fail.
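
Here is a sketch of such a test stage in a CodePipeline definition, invoking a hypothetical CodeBuild project that runs the test scripts; if the build (and therefore the tests) fails, the stage fails and the pipeline stops.

{
  "name": "Test",
  "actions": [
    {
      "name": "UnitAndComplianceTests",
      "actionTypeId": {
        "category": "Test",
        "owner": "AWS",
        "provider": "CodeBuild",
        "version": "1"
      },
      "configuration": {
        "ProjectName": "example-app-tests"
      },
      "inputArtifacts": [
        { "name": "SourceOutput" }
      ],
      "runOrder": 1
    }
  ]
}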

Open-source vulnerabilities can be challenging; by integrating WhiteSource with CodeBuild, we can automate scanning of any source repository in the pipeline.

Consider testing an end-to-end system including networking and communications between the microservices and other components such as databases.

Consider mocking the endpoints of third-party APIs with Lambda.

AMI Automation and Vulnerability Detection

It is recommended to build custom pre-baked AMIs, with all the necessary software installed, using the AWS Quick Start AMIs as a base, for faster deployment of instances.

We can automate the creation of AMIs with Systems Manager and restrict EC2 instance launches to only tagged AMIs while building the pipeline for deploying infrastructure with CloudFormation.
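
A sketch of the IAM condition that restricts ec2:RunInstances to AMIs carrying an approval tag; the tag key and value are placeholders, and a complete policy would also need to allow the other resources involved in a launch (instance, volumes, network interfaces, and so on).

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "LaunchOnlyApprovedAMIs",
      "Effect": "Allow",
      "Action": "ec2:RunInstances",
      "Resource": "arn:aws:ec2:us-west-2::image/ami-*",
      "Condition": {
        "StringEquals": {
          "ec2:ResourceTag/ApprovedAMI": "true"
        }
      }
    }
  ]
}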

It is important to periodically scan for vulnerabilities on the EC2 instances launched from our custom AMIs. Amazon Inspector, a managed service, makes it easy to find vulnerabilities in EC2 instances and ECR images. We can integrate Amazon Inspector into our pipeline with a Lambda function, and we can also automate remediation with the help of Systems Manager.

Deploy Stage

When we have an isolated production environment in a separate production account, consider building an AWS CodePipeline that spans multiple accounts and Regions. We just need to grant the necessary permissions with the help of IAM roles.
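
For example, the deployment role in the production account would trust the tools account where the pipeline lives; a sketch with a hypothetical tools-account ID:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::123456789111:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}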

Even the users who control the production environment should consider accessing it with IAM roles whenever required, which gives secure access with temporary credentials; this helps prevent accidental changes to the production environment. This technique of switching roles allows us to apply the principle of least privilege: take on ‘elevated’ permissions only while doing a task that requires them, and give them up when the task is done.
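
A sketch of the permission an operator would need to switch into a hypothetical production deployment role; the account ID and role name are placeholders.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::123456789222:role/ProdDeploymentRole"
    }
  ]
}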

Implement Multi-Factor Authentication (MFA) while deploying critical applications.
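
One way to enforce this for a manual approval gate is to require MFA on the approval action itself; a sketch with hypothetical pipeline, stage, and action names:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "codepipeline:PutApprovalResult",
      "Resource": "arn:aws:codepipeline:us-west-2:123456789111:ExamplePipeline/Deploy/ApproveProdDeploy",
      "Condition": {
        "Bool": {
          "aws:MultiFactorAuthPresent": "true"
        }
      }
    }
  ]
}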

Consider doing a canary deployment, that is, testing with a small amount of production traffic first, which can be easily managed with AWS CodeDeploy.
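
For a Lambda application, for example, CodeDeploy ships predefined canary configurations; a CloudFormation (JSON) sketch of a deployment group that shifts 10% of traffic and waits five minutes before shifting the rest, with hypothetical application and role names:

{
  "Resources": {
    "CanaryDeploymentGroup": {
      "Type": "AWS::CodeDeploy::DeploymentGroup",
      "Properties": {
        "ApplicationName": "example-lambda-app",
        "ServiceRoleArn": "arn:aws:iam::123456789111:role/CodeDeployServiceRole",
        "DeploymentConfigName": "CodeDeployDefault.LambdaCanary10Percent5Minutes",
        "DeploymentStyle": {
          "DeploymentType": "BLUE_GREEN",
          "DeploymentOption": "WITH_TRAFFIC_CONTROL"
        }
      }
    }
  }
}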