In Part 1 of this blog, we saw how AWS CodePipeline and CodeBuild were leveraged for our golden AMI pipelines and how CloudFormation saved us time and effort in replicating the pipelines. We also discussed the Packer configuration and the pipeline workflow. In this part, let's see how we implemented an open-source solution using Jenkins pipelines with Packer and Ansible to automate golden AMI baking.
The customer’s application, built in Ruby, was deployed on AWS EC2 (Amazon Linux) instances and managed using AWS OpsWorks for deployments. However, their autoscaling process took approximately 30–40 minutes, leading them to rely on scheduled autoscaling—incurring additional costs. Looking to reduce autoscaling time and adopt a real-time scaling approach, the customer also expressed the need to move away from OpsWorks to an open-source alternative.
As a trusted cloud consulting company, CloudifyOps provided a strategic solution through the implementation of a Golden AMI-based approach. Leveraging our cloud infrastructure services, we set up Jenkins pipelines to create Golden AMIs that included all necessary application dependencies and code. These pre-baked AMIs were then used for autoscaling, drastically improving the scaling speed and enabling real-time autoscaling capabilities.
This blog takes you through the solution in detail, showcasing how our expertise in cloud infrastructure services helped the customer achieve a faster, cost-effective, and more reliable autoscaling setup—fully aligned with their performance and deployment goals.
The above diagram shows the pipeline flow and the components used. The customer had their code hosted in GitHub. We had a Jenkins pipeline in place, with the first stage being the code checkout. Jenkins then validates the Packer file and, on successful validation, invokes the Packer build, which bakes the AMI for us. Once the AMI is baked, Jenkins sends out a post-build notification to the stakeholders. The customer already had a SendGrid SMTP setup, so we leveraged it for the notifications.
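The pipeline itself stops at the notification, but the baked AMI still has to reach the Auto Scaling group. A common wiring step, not shown in this blog, is to register a new launch template version with aws ec2 create-launch-template-version, passing launch template data such as the sketch below (the AMI ID is a placeholder):

{
  "ImageId": "ami-0123456789abcdef0"
}

The Auto Scaling group then launches instances from the pre-baked image, which is what cuts scaling time from tens of minutes down to roughly the instance boot time.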
Writing the Jenkinsfile was easy, with fewer stages and less complexity, as most of the heavy lifting was done by Packer. Below is the sample Jenkinsfile we used.
pipeline {
    agent any
    environment {
        // ***** the env variables go here
    }
    stages {
        stage ('1. Checking out Ansible & AMI repos') {
            steps {
                dir ('ansible') {
                    git branch: '*****', credentialsId: '*****-github', url: 'https://github.com/*********.git'
                }
                dir ('ami') {
                    git branch: '******', credentialsId: '*****-github', url: 'https://github.com/*****.git'
                }
            }
        }
        stage ('2. Baking Base AMI') {
            steps {
                script {
                    dir ('ami') {
                        sh '''
                            sudo rm -f build.log ami_id.txt
                            sudo packer validate packer_application.json
                            sudo packer build packer_application.json | tee build.log
                            touch ami_id.txt
                            sudo grep "${REGION}" build.log | cut -d ':' -f2 > ami_id.txt
                            # The pipe through tee masks Packer's exit status,
                            # so we fail the build ourselves if no AMI ID was captured
                            test -s ami_id.txt || echo 'Packer build failed'
                            test -s ami_id.txt || exit 1
                        '''
                        // trim() drops the trailing newline so the ID renders cleanly downstream
                        env.AMI_ID = sh(script: 'cat ami_id.txt', returnStdout: true).trim()
                    }
                }
            }
        }
    }
    post {
        always {
            emailext (
                body: """
                Hi,
                Base AMI Builder Pipeline build completed. Please go to the below URL for more details:
                ${env.BUILD_URL}
                Build Number - ${env.BUILD_NUMBER}
                Base AMI ID : ${env.AMI_ID}
                """,
                mimeType: 'text/html',
                recipientProviders: [*******()],
                subject: "Base AMI Builder Pipeline Build Status is ${currentBuild.result}",
                to: 'devops@****.com')
        }
    }
}
As discussed in Part 1 of this blog, Packer does the heavy lifting here. The only change from the earlier setup is in the source AMI filter: since we are using Amazon Linux 2, the filter values differ slightly.
"builders": [
  {
    "type": "amazon-ebs",
    "name": "Base_AMI",
    "region": "{{user `aws_region`}}",
    "source_ami_filter": {
      "filters": {
        "virtualization-type": "hvm",
        "name": "amzn2-ami-hvm-2.0*",
        "root-device-type": "ebs"
      },
      "owners": ["amazon"],
      "most_recent": true
    },
Here we can see the changed filter values. Apart from this, the Packer configuration and flow remain the same.
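For completeness, here is a minimal sketch of how the remaining sections of such a template can look, with Ansible handling the provisioning. The region default and the playbook path are illustrative assumptions, not the customer's actual values:

"variables": {
  "aws_region": "us-east-1"
},
"provisioners": [
  {
    "type": "ansible",
    "playbook_file": "../ansible/playbook.yml"
  }
]

The playbook_file points into the Ansible repository checked out in stage 1 of the Jenkinsfile, which is how the roles installing the application dependencies get applied to the instance during the bake.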
The previous project gave us good exposure to golden AMI pipelines built on AWS DevOps services. We used the same strategy here to overcome the AMI baking challenge.
The base AMI was packed with the Ruby and Rails packages and their dependencies, the AWS unified CloudWatch agent, a custom CloudWatch configuration for memory metrics, the AlienVault agent, the CrowdStrike agent, and more. We designed pipelines for the base AMI and for three other application-specific AMIs.
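Note that the unified CloudWatch agent does not publish memory metrics out of the box; they have to be enabled in the agent configuration baked into the AMI. A minimal sketch of such a config (the metric, interval, and dimension shown are illustrative, not the customer's exact setup):

{
  "metrics": {
    "append_dimensions": {
      "AutoScalingGroupName": "${aws:AutoScalingGroupName}"
    },
    "metrics_collected": {
      "mem": {
        "measurement": ["mem_used_percent"],
        "metrics_collection_interval": 60
      }
    }
  }
}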
The customer wanted the solution to be cloud agnostic, as they had plans for a multi-cloud architecture. At the same time, they did not want the solution tied to a source-control-specific platform like GitHub Actions, as they planned to migrate their code base to Bitbucket. Jenkins as the CI platform is cloud agnostic and independent of any source control platform, and its vast plugin library allows seamless third-party tool integration. We also had the infra provisioning pipelines and the DR pipelines in the scope of work, so managing a Jenkins server was a clear choice for us.
To know more about how the CloudifyOps team can help you with security compliance solutions, write to us today at sales@cloudifyops.com