CI/CD on AWS using CodeBuild, CodeDeploy and CodePipeline

AWS CodeDeploy

In this post, we will see how AWS CodeDeploy can be used to automate the deployment of applications to servers running either in the cloud or on-premises, or to the AWS Lambda service. We will first see where AWS CodeDeploy fits into the CI/CD setup.

AWS CodeDeploy is a fully managed deployment service. AWS CodeDeploy can be used to deploy your build artifacts into various environments.

For example, CodeDeploy can be used to take your Java libraries and/or executables and deploy them onto your web servers, or take your updated Lambda function and deploy it to the Lambda service.

CodeDeploy makes it effortless to roll out application changes quickly and in an automated fashion, ensuring that new features are available to your end users as early as possible while avoiding downtime of the current system.

As we’ll see later on in this post, AWS CodeDeploy can be triggered to deploy automatically as part of the CodePipeline release process. CodeDeploy uses an agent-based method for performing deployments on servers, whether they are EC2 instances or on-prem servers.

The CodeDeploy agent is not required for deployments that target serverless Lambda functions.

AWS provides an installable agent package for both Linux and Windows operating systems. Once installed, the agent communicates back with the CodeDeploy service using outbound HTTPS connections on port 443, polling for deployment updates. The behavior of the agent can be controlled via a local configuration file.

AWS CodeDeploy acts as the deployment system for your software projects. It is the service that is used to deploy your build artifacts stored in S3 onto your servers or to the Lambda service.

As we’ve already seen, AWS CodeBuild can be used to build the artifacts which can then be given to AWS CodeDeploy for deployment.

Installing the CodeDeploy agent is very simple. AWS provides install scripts that can be used to bootstrap the agent during launch.

Scripts or install commands exist for common operating systems such as Windows, Amazon Linux, Red Hat Enterprise Linux, and Ubuntu. The following example can be used to install the CodeDeploy agent on a Linux operating system that uses yum as its package manager, such as Amazon Linux.
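
A minimal sketch of that install sequence is shown below. The install bucket is region-specific, so the us-east-1 URL here is an assumption; swap in your own region.

    # Install prerequisites and the CodeDeploy agent on Amazon Linux (yum-based).
    sudo yum update -y
    sudo yum install -y ruby wget
    cd /home/ec2-user
    # Region-specific install bucket; us-east-1 assumed here.
    wget https://aws-codedeploy-us-east-1.s3.us-east-1.amazonaws.com/latest/install
    chmod +x ./install
    sudo ./install auto
    # Confirm the agent is running.
    sudo service codedeploy-agent status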

These commands can be integrated into the user data section of an EC2 instance launch to have the CodeDeploy agent automatically installed at launch time.

CodeDeploy uses either a JSON or YAML-based AppSpec file to drive the deployment, depending on the deployment target.

If you are deploying a Lambda function, then you can use either JSON or YAML within your AppSpec file. However, if you are deploying to an EC2 instance or on-prem server, then you must use YAML only. Within this file, you specify the sequence of install steps, which may include specific commands and/or scripts that you want to run. The following example demonstrates a potential WordPress installation and deployment, consisting of scripts to perform the installation, create the database, and start and stop the web server.
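
A sketch of such an AppSpec file is shown below; the script names under scripts/ are illustrative assumptions rather than a definitive layout.

    version: 0.0
    os: linux
    files:
      - source: /
        destination: /var/www/html/WordPress
    hooks:
      BeforeInstall:
        - location: scripts/install_dependencies.sh   # e.g. install Apache, PHP, MySQL
          timeout: 300
          runas: root
      AfterInstall:
        - location: scripts/create_database.sh        # hypothetical script to create the WordPress database
          timeout: 300
          runas: root
      ApplicationStart:
        - location: scripts/start_server.sh
          timeout: 300
          runas: root
      ApplicationStop:
        - location: scripts/stop_server.sh
          timeout: 300
          runas: root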

Another common pattern is an AppSpec file that installs and sets up an NGINX-based website. Again, the AppSpec configuration consists of scripts to perform the installation, deploy the website files, and start and stop the NGINX server.

Deployment Groups

AWS CodeDeploy has the concept of deployment groups. A deployment group represents a group of servers that have the CodeDeploy agent installed and to which you will deploy a version, or revision, of your software application. Deployment groups can be configured by specifying:

  • an auto scaling group,
  • EC2 instances identified by tags, or
  • on-prem servers identified by tags.

A deployment group can consist of any combination of the three options mentioned above.
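
For example, a deployment group targeting EC2 instances by tag could be created with the AWS CLI along the following lines; the application, group, tag, and role names are hypothetical.

    aws deploy create-deployment-group \
      --application-name MyWebApp \
      --deployment-group-name Production \
      --ec2-tag-filters Key=Environment,Value=Production,Type=KEY_AND_VALUE \
      --service-role-arn arn:aws:iam::123456789012:role/CodeDeployServiceRole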

The steps to provision a new deployment within AWS CodeDeploy are simple. The key configuration attributes are the deployment group, which represents the set of instances or servers to which the deployment will be rolled out, and the revision, which represents the version of the software package that will be deployed and installed onto the deployment group.
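
A deployment of a revision stored in S3 could then be triggered roughly as follows; the bucket, key, and names are placeholders.

    aws deploy create-deployment \
      --application-name MyWebApp \
      --deployment-group-name Production \
      --s3-location bucket=my-artifact-bucket,key=mywebapp/app-1.2.0.zip,bundleType=zip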

Controlling the deployment speed

When performing a deployment, you can control the speed of your deployment in terms of how many of your instances or servers are updated at once, or, in the case of a Lambda function, how traffic is shifted incrementally from the old version of the function to the new one.

Pre-defined deployment configurations

To facilitate and control how deployments take place, you have the option of choosing from a number of deployment configuration presets, such as CodeDeployDefault.AllAtOnce, CodeDeployDefault.HalfAtATime, and CodeDeployDefault.OneAtATime, or, if required, creating your own custom deployment configuration.
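
As a sketch, a custom configuration for EC2/on-prem deployments could be created with the AWS CLI like this; the configuration name is hypothetical.

    # Require at least 75% of instances to remain healthy during a deployment.
    aws deploy create-deployment-config \
      --deployment-config-name Custom.75PercentHealthy \
      --minimum-healthy-hosts type=FLEET_PERCENT,value=75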

Integration options

AWS CodeDeploy provides several integration options.

  • CodeDeploy CloudWatch Events: CodeDeploy publishes events to CloudWatch such that you can detect and react to changes in the state of an instance or a deployment.
  • CodeDeploy Deployment Group Triggers: With deployment group triggers, you can register an SNS topic to receive deployment event messages. You can then subscribe to the topic and perform any necessary downstream actions.
  • AppSpec file: As we’ve already seen, the AppSpec file gives you the capability to hook in commands or custom scripts to react before or after particular deployment events.

AWS CodeBuild

AWS CodeBuild can be used to compile, build and test your source code.

AWS CodeBuild is a fully managed build service. AWS CodeBuild can be used to compile source code to generate build artifacts. For example, CodeBuild can be used to compile Java source code.

Another interesting use case for CodeBuild is building Docker images and then registering them in ECR.

As we will see later on in this post, AWS CodeBuild can be triggered to build automatically off commits made to a CodeCommit repository. More on that later.

Under the hood, AWS CodeBuild uses Docker containers to manage and execute builds. CodeBuild supports several preconfigured Docker container build environments out of the box: .NET Core, Java, Ruby, Python, Go, NodeJS, Android and Docker are all supported.

If none of these fit your needs, you can create your own custom build Docker images, preloaded and configured with whatever build tools you require. The resulting custom Docker image is then uploaded to Amazon Elastic Container Registry (ECR) or Docker Hub and becomes available to be configured into your own CodeBuild projects.

BuildSpec file

CodeBuild uses a YAML-based BuildSpec file to drive the build. The structure and outline of the BuildSpec file is shown below.
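
A minimal sketch of that outline follows; the phase names are the ones CodeBuild recognizes, while the commands are placeholders.

    version: 0.2
    phases:
      install:
        commands:
          - echo "install build-time dependencies here"
      pre_build:
        commands:
          - echo "commands run before the build, e.g. logging in to a registry"
      build:
        commands:
          - echo "the actual compile and build commands"
      post_build:
        commands:
          - echo "commands run after the build, e.g. packaging or pushing"
    artifacts:
      files:
        - '**/*'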

Within this file, you specify the build phases, or sequence of steps, and the particular build commands used to turn your source code into build artifacts.

Let’s now take a closer look at a couple of examples. In the first example, the BuildSpec file is used to build a Java-based application. It first downloads and installs Maven, then uses Maven to download library dependencies and perform compilation and packaging. The end result is a JAR file, which is referenced within the artifacts section.
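
A sketch along those lines is shown below; it assumes an Ubuntu-based build image where apt-get is available, and the JAR name is a placeholder.

    version: 0.2
    phases:
      install:
        commands:
          - apt-get update -y
          - apt-get install -y maven        # assumes an Ubuntu-based build image
      build:
        commands:
          - mvn clean package               # resolves dependencies, compiles and packages
    artifacts:
      files:
        - target/my-app-1.0.jar             # placeholder artifact name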

In this second example, the BuildSpec file instructs CodeBuild to build a Docker image.

The BuildSpec file consists of three build phases: pre-build, build, and post-build. Assuming that a Dockerfile exists in the same directory as the BuildSpec file, CodeBuild will go ahead and build the required Docker image as per the instructions contained within the Dockerfile. The resulting Docker image is then automatically uploaded and registered into ECR. Several environment variables are injected at build time.
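
A sketch of such a BuildSpec is shown below. AWS_DEFAULT_REGION is provided by CodeBuild itself, while AWS_ACCOUNT_ID, IMAGE_REPO_NAME and IMAGE_TAG are assumed to be set as environment variables on the build project.

    version: 0.2
    phases:
      pre_build:
        commands:
          # Authenticate the Docker client against ECR
          - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com
      build:
        commands:
          # Build the image from the Dockerfile in this directory and tag it for ECR
          - docker build -t $IMAGE_REPO_NAME:$IMAGE_TAG .
          - docker tag $IMAGE_REPO_NAME:$IMAGE_TAG $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG
      post_build:
        commands:
          # Push the image into the ECR repository
          - docker push $AWS_ACCOUNT_ID.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/$IMAGE_REPO_NAME:$IMAGE_TAG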

How it fits into CI/CD

As already described, AWS CodeBuild acts as the build or compile system for your software projects. It is the service into which your source code is passed and within which it then undergoes a build or compile phase.

As we will see later:

AWS CodePipeline is the service that glues or plumbs AWS CodeCommit and AWS CodeBuild together through the use of a pipeline.

When this is set up and configured correctly, commits into a CodeCommit repository will trigger your respective CodeBuild project to kick off and build the expected build artifacts.

How to provision it?

The steps to provision a new build project within AWS CodeBuild are fairly simple. The key configuration attributes are:

  • Source provider: the location of your source code. Currently, CodeBuild supports the following source locations: S3, CodeCommit, Bitbucket, GitHub and GitHub Enterprise.
  • Build container: the Docker build container that will be used to perform the build. This can be selected from the list of preconfigured containers such as .NET Core, Java, Ruby, Python, Go, NodeJS, Android and Docker, or be a custom one that you provide.
  • Artifacts location: the location to which the build artifacts will be stored. Currently only S3 is supported.
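
As a rough sketch, an equivalent project could be defined from the AWS CLI as follows; the project name, repository URL, bucket, build image, and role ARN are placeholders.

    aws codebuild create-project \
      --name my-java-app \
      --source type=CODECOMMIT,location=https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-java-app \
      --artifacts type=S3,location=my-artifact-bucket \
      --environment type=LINUX_CONTAINER,image=aws/codebuild/standard:7.0,computeType=BUILD_GENERAL1_SMALL \
      --service-role arn:aws:iam::123456789012:role/CodeBuildServiceRole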

Integrations

AWS CodeBuild provides several integration options.

  • CodeBuild CloudWatch Events: CodeBuild publishes events to CloudWatch such that you can detect and react to changes in the state of your build.
  • S3 artifacts: When a build project completes, you have the option to save your build artifacts out to an S3 bucket. You can leverage S3 bucket triggers to trigger other actions, for example, a Lambda function that performs some follow-up action.
  • BuildSpec file: As we’ve already seen, the BuildSpec file gives you the capability to hook in commands or custom scripts to be performed at particular points within a build.

AWS CodePipeline

Finally, let’s look at AWS CodePipeline and how it can be used to orchestrate the delivery of your software from source code through to executable and deployable releases.

AWS CodePipeline is a fully managed continuous delivery service used to automate the building, testing, and deployment of your code. CodePipeline provides visualization tools that give you the ability to understand the high-level steps involved in, and orchestrated by, CodePipeline. The CodePipeline console allows you to see the current state of any release and whether the overall pipeline has succeeded or has been blocked by, perhaps, a failed test. AWS CodePipeline has many great features, some of which include:

  • automation of build, test and release stages,
  • manual approvals,
  • pipeline history reports and
  • pipeline status visualizations.

AWS CodePipeline is used to orchestrate the various phases within your CI/CD setup. It does so by using the concept of a pipeline.

AWS CodePipeline has seamless integration into other AWS development services, such as CodeCommit, CodeBuild, and CodeDeploy. As an example, AWS CodePipeline can be used to create a pipeline which gets the latest code commit, runs it through a build phase then through a testing phase, then deploys the release to staging, followed by a manual approval process to push the release to production.

Understanding how to set up automated build and release processes with CodePipeline requires familiarization with several concepts. Let’s cover these now. A pipeline is constructed using one or many stages. Each stage can, itself, have one or many actions. Actions within a stage run either in parallel or sequentially. A transition exists between stages.
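
To make the stage and action structure concrete, here is a trimmed sketch of a two-stage pipeline defined in CloudFormation; the role ARN, artifact bucket, repository name, and build project name are hypothetical placeholders.

    Resources:
      AppPipeline:
        Type: AWS::CodePipeline::Pipeline
        Properties:
          RoleArn: arn:aws:iam::123456789012:role/MyCodePipelineRole   # placeholder
          ArtifactStore:
            Type: S3
            Location: my-pipeline-artifact-bucket                      # placeholder
          Stages:
            - Name: Source
              Actions:
                - Name: FetchSource
                  ActionTypeId:
                    Category: Source
                    Owner: AWS
                    Provider: CodeCommit
                    Version: "1"
                  Configuration:
                    RepositoryName: my-repo                            # placeholder
                    BranchName: main
                  OutputArtifacts:
                    - Name: SourceOutput
            - Name: Build
              Actions:
                - Name: BuildApp
                  ActionTypeId:
                    Category: Build
                    Owner: AWS
                    Provider: CodeBuild
                    Version: "1"
                  Configuration:
                    ProjectName: my-build-project                      # placeholder
                  InputArtifacts:
                    - Name: SourceOutput
                  OutputArtifacts:
                    - Name: BuildOutput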

CodePipeline has six action types. They are, in no particular order, approval, source, build, test, deploy, and invoke. Let’s cover off each of these now.

  • Approval: the approval action is used to add a manual approval step. This is useful when you want to ensure that your software release is tested by somebody before it is deployed into production.
  • Source: the source action is used to specify the location of the source code. This action type, for example, can be configured to pull the source from an S3 bucket, a CodeCommit repository, or a GitHub repository.
  • Build: the build action is used to specify the build process, for example, how the source code gets compiled. This action type can be configured to perform the build phase using any one of the following build providers: AWS CodeBuild, Jenkins, or Solano CI.
  • Test: the test action is used to specify the testing process, for example, how the release is to be tested before deployment. This action type can be configured to perform the test phase using any one of the following test providers: AWS CodeBuild, Jenkins, BlazeMeter, Ghost Inspector, and a few others as well.
  • Deploy: the deploy action is used to specify the deployment process, for example, how the release is to be deployed. This action type can be configured to perform the deployment phase using any one of the following deployment providers: ECS, CodeDeploy, CloudFormation, or Elastic Beanstalk.
  • Invoke: the invoke action is used to provide further integration options and flexibility by allowing you to implement custom Lambda functions that can be called directly from within the pipeline via this action type.

The steps to set up a new AWS CodePipeline build and release process are fairly simple. The AWS CodePipeline console provides a nice wizard-driven approach for constructing your pipelines. The end result of stepping through the guided create-pipeline wizard will be a pipeline configured with three stages (a source stage, a build stage, and a staging stage), assuming each of these stages was configured with an action.

From here, you can enter the pipeline and customize it further by adding in extra stages, adding or removing actions within a particular stage, or perhaps updating the execution order of actions from sequential to parallel or vice versa.

Integration options within AWS CodePipeline are numerous, which is a good thing. Software projects come in all sorts of shapes and sizes and may be governed by different methodologies.

Integrations

With this in mind, CodePipeline and its many integration options provide you with the flexibility to mold the perfect build and release process for your project. Let’s cover off what options are available to you.

As already mentioned, CodePipeline supports six action types: source, build, test, invoke, approve, and deploy. Each of these action types has a set of different options allowing you to call out to the system or service of your choice. In particular, the invoke action allows you to call out to a custom Lambda function that may perform some custom activity.

Next, AWS CodePipeline supports CloudWatch Events. You can use CloudWatch Events to detect and react to state changes within your pipelines. This capability allows you to wire notifications into SNS topics, which can then kick off Lambda functions that perform some other function within your build and release process.

Pipelines created within CodePipeline are, by default, configured to use Amazon CloudWatch Events so that they start automatically when a new commit is detected in CodeCommit. Alternatively, this behavior can be swapped for a polling mechanism, where CodePipeline instead polls periodically for changes within your source repository.
