“An SRE is more focused on monitoring and analytics”.
“A DevOps engineer is more focused on the setup and creation of infrastructure”.
Tools: Jenkins, Pipelines, Prometheus and Ansible
Terms: CI/CD, Pipelines, Configuration Management, Monitoring and Log Aggregation.
To facilitate software development, Continuous Integration / Continuous Delivery (CI/CD) has become a Swiss Army knife of development, deployment, and testing. One of the oldest and most popular actively maintained frameworks for CI/CD is Jenkins, which was released in 2011 and is open source. In addition, Jenkins is highly modular and supports a multitude of plugins.
Most of our focus in CI/CD will be on pipelines. With pipelines we are able to use code checked into a
Git repository to control the execution, linting, security testing, and performance testing of code.
Goals: The goal is to develop the skills needed to:
- a. create an environment from code (infrastructure as code),
- b. create a development pipeline on this infrastructure, and
- c. monitor and maintain this infrastructure through the use of monitoring, metrics, and log analytics.
One of the beautiful things about the field of DevOps is the flexibility and interoperability of the different components, which can be compared to Lego pieces. In this post, we will start to understand how the building blocks of development, monitoring, and deployment fit together and complement each other.
Jenkins is the Swiss Army knife of build tools: it is highly extensible and flexible, offers a large ecosystem of plugins, and supports many different languages.
Jenkins X: Jenkins for Docker and microservices; it integrates well with Kubernetes.
AWS IAM setup
Basically, when you start out you use the root credentials, but we want to follow best practices within the space, so we are going to create a new user that only has access to the resources we specifically assign to it. That's because you want a minimum-permission model. So, let's dive into IAM within AWS, which is really the foundation for security in an AWS environment. With this tool, we are able to lock things down so that we can keep out bad actors and adopt a minimum-permission model for our user.
Step 1: Create a policy named Minimum_Security_Model. Normally you would choose one of the pre-configured policies, but let's create one that is custom to our use and give it full access to the services we need.
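Under the hood, a customer-managed IAM policy is just a JSON document. Here is a minimal sketch; the single `ec2:*` action is purely illustrative, and in practice you would scope the Action and Resource fields to whatever services your pipeline actually needs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "MinimumSecurityModel",
      "Effect": "Allow",
      "Action": "ec2:*",
      "Resource": "*"
    }
  ]
}
```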
Step 2: The next step is to attach that policy to a group. First create a new group called sk_DevOps, then apply the Minimum_Security_Model policy to this group.
Step 3: The last step is now to add our user to the group that has the attached policy.
Log in to the console.
AWS EC2 setup
Log in to the AWS console using the user you just created. Navigate to EC2 instances and click on
Launch Instance. Filter for Ubuntu images and select the one shown below:
Click through the defaults. Give it a tag:
Name: Jenkins. Next comes the most important step: in the security group's SSH rule, set the source IP to
My IP. This is because we want to allow access only from our own local IP address.
People scan the internet 24 hours a day for new instances coming online on AWS and then attempt to compromise those hosts, so you don't want your host to be one of the ones that gets compromised.
NOTE: If your network changes (say, you go from the library to home and get a new public-facing IP address), you will need to update this rule in the AWS console as well.
Next, we are also going to add one more Custom TCP rule for port 8080, the port of the Jenkins console. Jenkins is a Java application, and 8080 is the conventional port for Java web applications that would otherwise run on port 80. Set the source for this rule to My IP as well, then create and launch the EC2 instance.
At this point, you will be prompted to set up your SSH key pair. Note that this is different from the Access Key and Secret Key you downloaded earlier; those are your programmatic access keys for the AWS CLI tools.
Awesome! You can now see that the instance has been created.
We are now going to install Jenkins on this EC2 Instance.
Install Jenkins on EC2
Great, so we now have our new AWS EC2 instance, and we have also downloaded the key pair sk_devops. We are now going to use this key pair to log in to that EC2 instance.
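The login itself looks something like the following sketch. The key file name matches the key pair we downloaded; the hostname is a placeholder for your instance's public DNS name, and we create an empty stand-in file here just so the permissions step can be demonstrated (ssh refuses a private key that other users can read):

```shell
# Stand-in for the real downloaded key file (illustrative only)
touch sk_devops.pem

# ssh refuses world-readable private keys, so restrict permissions first
chmod 400 sk_devops.pem
stat -c '%a' sk_devops.pem   # prints 400

# Then connect (replace the placeholder with your instance's public DNS):
# ssh -i sk_devops.pem ubuntu@<ec2-public-dns>
```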
So, first things first, we are going to ensure that we are using the latest and greatest versions of our packages. Let's first do a
sudo apt update.
ubuntu@ip-172-31-47-188:~$ sudo apt update
Fetched 18.4 MB in 4s (4853 kB/s)
Reading package lists... Done
Building dependency tree
Reading state information... Done
49 packages can be upgraded. Run 'apt list --upgradable' to see them.
ubuntu@ip-172-31-47-188:~$
Installing Jenkins Dependencies: Jenkins is a Java application, so let’s first install the
default-jdk. Shown below are all the steps to run in sequence.
sudo apt-get update
sudo apt install -y default-jdk
wget -q -O - https://pkg.jenkins.io/debian/jenkins.io.key | sudo apt-key add -
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install -y jenkins
Verify Jenkins is running:
ubuntu@ip-172-31-47-188:~$ ps -ef | grep java
jenkins   7576     1  0 00:48 ?      00:00:00 /usr/bin/daemon --name=jenkins --inherit --env=JENKINS_HOME=/var/lib/jenkins --output=/var/log/jenkins/jenkins.log --pidfile=/var/run/jenkins/jenkins.pid -- /usr/bin/java -Djava.awt.headless=true -jar /usr/share/jenkins/jenkins.war --webroot=/var/cache/jenkins/war --httpPort=8080
jenkins   7577  7576 43 00:48 ?      00:00:29 /usr/bin/java -Djava.awt.headless=true -jar /usr/share/jenkins/jenkins.war --webroot=/var/cache/jenkins/war --httpPort=8080
ubuntu    7798  1764  0 00:49 pts/0  00:00:00 grep --color=auto java
ubuntu@ip-172-31-47-188:~$
Login to Jenkins console
So far, we have installed Jenkins from the command line. Let's now log in to the GUI for the first time. Take the URL of your EC2 instance and append port 8080. You will see a prompt to enter the password for Jenkins; follow the instructions and log in. Once you are in, you will be asked whether you want to install the recommended plugins or choose them manually. I chose the recommended plugins. You will then be prompted to create a new user. I created
sk_jenkins as the user to log in to the Jenkins console, with my password.
Next, we need to configure Jenkins. As you can see, there are a multitude of options here. For now, we are going to configure some plugins. Click on
Manage Plugins, go to the
Available tab, search for Blue Ocean, and select the following options.
Finally, click on Install without restart. This will show you the progress of installing each of these plugins. At the end, just refresh and click on Restart Jenkins.
Give it some time, and it will prompt you to log in again. At this point, we have the Blue Ocean plugin installed; we had to restart Jenkins in order to load those binaries correctly.
So, once you log in, you will see an Open Blue Ocean link on the left side. Blue Ocean is what we will use to create our pipelines.
Let's see how CI and CD are done in the DevOps space. First, let's look at what the world was like before CI and CD. In earlier times, we had tools to automate parts of this process, much of which was just deploying binaries to servers: the typical Friday-evening deployment using bundles and the like. As you know, this came with a lot of brittleness and was extremely time consuming. Imagine doing a SESF deployment. That's where the need for a tool like Jenkins to manage all of this comes in.
With Jenkins, we have a huge number of plugins that allow us to store our configuration as code in GitHub and to create pipelines, which we are going to do with Blue Ocean. There are many alternatives to Blue Ocean for building pipelines, including the classic Jenkins interface itself.
“Continuous Integration (CI) is a development practice where developers integrate code into a shared repository frequently, preferably several times a day. Each integration can then be verified by an automated build and automated tests.”
“Continuous deployment (CD) is a strategy for software releases wherein any code commit that passes the automated testing phase is automatically released into the production environment, making changes that are visible to the software’s users.”
So, let's take a 10,000-foot view of what that means; shown here is a pictorial representation of the CI/CD process. It starts with a user who checks a change in to a Git repo for a new feature added to a product. The pull request is then reviewed by two of your colleagues and merged back into your repository, at which point Jenkins is able to start the pipeline. Within this pipeline, we have our code-linting step; after we have successfully linted, we run a security check, and finally we actually deploy. This applies mainly to our development and staging environments; production environments usually don't use continuous deployment because there is some risk involved.
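The flow just described can be sketched as a simple shell script. The stage functions below are illustrative stand-ins for real tools (a linter, a security scanner, a deploy step), and a real Jenkins pipeline would run each of them as a separate stage:

```shell
#!/bin/sh
set -e  # stop the pipeline as soon as any stage fails

# Each function stands in for a real tool invocation
lint()           { echo "lint: OK"; }      # e.g. a code linter
security_check() { echo "security: OK"; }  # e.g. a dependency scanner
deploy()         { echo "deploy: OK"; }    # e.g. push to dev/staging

# Run the stages in order; a failure in any stage aborts the rest
lint
security_check
deploy
```

Because of `set -e`, a failing lint or security check prevents the deploy from ever running, which is exactly the gating behavior a CI/CD pipeline provides.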
One of the key best practices of DevOps is to be able to do “Infrastructure as Code”.
A pipeline enables us to store our Jenkins project configuration as code in a Git repository.
The previous way of doing this was to store the configuration as text on the Jenkins server. However, it is far superior to store it in a Git repository, because that way we can version it, review it, open pull requests against it, and integrate it just like the rest of our code.
A pipeline contains steps which have different actions performed as part of those steps.
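Concretely, with Jenkins this configuration lives in a Jenkinsfile checked in at the root of the repository. A minimal declarative sketch, where the stage names and shell commands are illustrative placeholders rather than a real project's pipeline:

```groovy
pipeline {
    agent any
    stages {
        stage('Lint') {
            steps {
                // placeholder for a real linting command
                sh 'echo "running lint"'
            }
        }
        stage('Security') {
            steps {
                // placeholder for a real security scan
                sh 'echo "running security scan"'
            }
        }
        stage('Deploy') {
            steps {
                // placeholder for a real deployment step
                sh 'echo "deploying to staging"'
            }
        }
    }
}
```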
You've learned a lot in this lesson! Here are some of the key skills and topics covered in the course so far:
- Set up IAM user, role, group & policy
- Launched EC2 instance
- Installed Jenkins
- Enabled BlueOcean
- Showed the components of CI/CD
- Described a pipeline
Finally, once you log in to Jenkins and click on
Blue Ocean, you should see the following screen. In the next post, we will dive into creating pipelines with it.