This project automates the setup of a DevOps pipeline for CI/CD and code quality management using Terraform, Ansible, and Docker Compose. The infrastructure is provisioned on AWS, with Terraform creating a VPC, subnet, and a t2.micro EC2 instance. Ansible is used to configure the EC2 instance by installing Docker and deploying services like Jenkins and SonarQube. Docker Compose orchestrates the deployment of Jenkins, SonarQube, and a PostgreSQL database for SonarQube. This end-to-end setup streamlines the development process, enabling efficient CI/CD workflows and static code analysis in a cloud environment. This project was presented to the DevOps team at Tkxel as part of my final internship assignment.
Directory: '.github/workflows'

The CI/CD workflow is defined in the ci-cd.yml file. It is executed on every push or pull request made to the main branch. The workflow includes the following jobs:
- **Terraform**: Provisions AWS resources using Terraform. It sets up the required providers, initializes the VPC, creates subnets, attaches an internet gateway, and configures route tables.
- **Ansible**: Triggered after the Terraform job completes. It uses Ansible to copy the docker-compose.yml file to the remote system and run it.
Each job is executed on an Ubuntu virtual machine, and the necessary configurations and dependencies are set up by the workflow's scripts.
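A minimal sketch of what ci-cd.yml could look like, assuming the two jobs described above. Step details, action versions, and file paths are illustrative, not taken from the actual repository:

```yaml
# Illustrative ci-cd.yml sketch; adjust paths and versions to your repo.
name: ci-cd
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  terraform:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Provision AWS resources
        working-directory: terraform
        run: |
          terraform init
          terraform apply -auto-approve
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.TF_USER_AWS_KEY }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.TF_USER_AWS_SECRET }}

  ansible:
    needs: terraform          # runs only after the Terraform job completes
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure the EC2 instance
        run: ansible-playbook -i ansible/inventory.ini ansible/docker-setup.yml
```

The `needs: terraform` line is what enforces the job ordering described above.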
Directory: 'terraform/'

The Terraform configuration is divided into multiple files which define the AWS resources and configurations required for the VPC setup.
- `backend.tf`: Defines the backend configuration for storing Terraform state remotely in an S3 bucket. The state file is stored in the `devops-superhero-bucket` bucket under the `terraformstate.tfstate` key.
- `variables.tf`: Defines the variables used across the Terraform configuration, such as CIDR blocks for the VPC and subnet, availability zones, and instance types.
- `main.tf`: The main Terraform configuration file, defining the infrastructure as code. It includes resources for provisioning an AWS VPC, public subnet, internet gateway, security group, and EC2 instance.
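The backend configuration described above might look like the following sketch; the bucket and key come from the description, while the region is an assumption:

```hcl
# Illustrative backend.tf; bucket and key per the description above,
# region assumed to match the rest of the project (us-east-1).
terraform {
  backend "s3" {
    bucket = "devops-superhero-bucket"
    key    = "terraformstate.tfstate"
    region = "us-east-1"
  }
}
```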
The script performs the following tasks:
- Defines the AWS region (`us-east-1`).
- Retrieves an SSH private key stored in AWS Secrets Manager.
- Creates a VPC with the CIDR block defined in `variables.tf`.
- Defines a public subnet and associates it with the VPC and the internet gateway.
- Opens ports for SSH (22), HTTP (80), Jenkins (8080), and other necessary services.
- Deploys a `t2.micro` EC2 instance using the Ubuntu AMI and connects it to the created security group and subnet.
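The steps above can be sketched roughly as follows. Resource names, variable names (`var.vpc_cidr`, `var.subnet_cidr`, `var.ubuntu_ami`), and the abbreviated security group are assumptions, and the Secrets Manager lookup and route table are omitted for brevity:

```hcl
# Hedged sketch of main.tf; names and variables are illustrative.
provider "aws" {
  region = "us-east-1"
}

resource "aws_vpc" "main" {
  cidr_block = var.vpc_cidr
}

resource "aws_internet_gateway" "gw" {
  vpc_id = aws_vpc.main.id
}

resource "aws_subnet" "public" {
  vpc_id                  = aws_vpc.main.id
  cidr_block              = var.subnet_cidr
  map_public_ip_on_launch = true
}

resource "aws_security_group" "allow_services" {
  vpc_id = aws_vpc.main.id

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }
  # ...similar ingress blocks for ports 80, 8080, 9000...

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_instance" "app" {
  ami                    = var.ubuntu_ami
  instance_type          = "t2.micro"
  subnet_id              = aws_subnet.public.id
  vpc_security_group_ids = [aws_security_group.allow_services.id]
}
```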
Directory: 'ansible/'
The Ansible playbook, defined in docker-setup.yml, is triggered after the Terraform job completes. It is responsible for installing Docker and Docker Compose on the EC2 instance and deploying the Docker containers defined in docker-compose.yml.
The script performs the following tasks:
- Installs necessary packages for Docker installation.
- Adds the Docker GPG key to verify the authenticity of the Docker package.
- Installs Docker and Docker Compose on the EC2 instance.
- Ensures that the Docker service is running and enabled.
- Uses Docker Compose to deploy the containers defined in docker-compose.yml.
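The tasks above might be expressed in docker-setup.yml along these lines. Task names, file paths, and package names are illustrative, and the Docker apt repository setup is omitted for brevity:

```yaml
# Hedged sketch of docker-setup.yml; paths and names are illustrative.
- hosts: all
  become: true
  tasks:
    - name: Install prerequisite packages
      apt:
        name: [ca-certificates, curl, gnupg]
        update_cache: true

    - name: Add the Docker GPG key
      apt_key:
        url: https://download.docker.com/linux/ubuntu/gpg
        state: present

    # (Adding the Docker apt repository is omitted here for brevity.)
    - name: Install Docker and the Compose plugin
      apt:
        name: [docker-ce, docker-compose-plugin]
        state: present

    - name: Ensure the Docker service is running and enabled
      service:
        name: docker
        state: started
        enabled: true

    - name: Copy docker-compose.yml to the remote host
      copy:
        src: docker-compose.yml
        dest: /opt/app/docker-compose.yml

    - name: Start the containers
      command: docker compose up -d
      args:
        chdir: /opt/app
```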
Directory: 'ansible/'

The docker-compose.yml file defines the services and configurations required for running the Jenkins and SonarQube containers. It includes:
- **Jenkins**: Based on the "jenkins/jenkins:lts" image. It exposes ports 8080 and 50000 and mounts the Jenkins configuration and Docker socket volumes for persistence and access to the host's Docker daemon.
- **SonarQube**: Based on the "sonarqube:latest" image. It runs on port 9000 and depends on the "db" service. It sets environment variables for the SonarQube database connection.
- **DB**: Based on the "postgres:latest" image. It runs a PostgreSQL database for SonarQube and sets the necessary environment variables.
The docker-compose.yml file ensures that the Jenkins and SonarQube containers are properly configured and running on the remote system. Make sure the EC2 instance's security group allows the ingress ports mentioned above.
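A minimal sketch of the three services described above. Volume names, database credentials, and environment variable values are illustrative placeholders, not the project's actual settings:

```yaml
# Hedged sketch of docker-compose.yml; credentials are placeholders.
services:
  jenkins:
    image: jenkins/jenkins:lts
    ports:
      - "8080:8080"
      - "50000:50000"
    volumes:
      - jenkins_home:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock  # host Docker access

  sonarqube:
    image: sonarqube:latest
    ports:
      - "9000:9000"
    depends_on:
      - db
    environment:
      SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonar
      SONAR_JDBC_USERNAME: sonar
      SONAR_JDBC_PASSWORD: sonar

  db:
    image: postgres:latest
    environment:
      POSTGRES_USER: sonar
      POSTGRES_PASSWORD: sonar
      POSTGRES_DB: sonar

volumes:
  jenkins_home:
```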
In this documentation, we have provided a detailed explanation of the CI/CD pipeline implemented in this project. GitHub Actions triggers the workflow, which first runs Terraform to provision AWS resources and then runs Ansible to configure the remote system and launch the application containers with Docker Compose.
By following this pipeline, you can automate the deployment and configuration of your application. As a next step, add 'nginx' and complete the SonarQube and Jenkins pipelines to deploy your app. This saves time and ensures consistency in your DevOps processes!
- Ensure the AWS credentials are correctly set and that the S3 bucket is properly initialized to avoid creating new resources on every pipeline run.
- Ensure the correct public IP is added to the `inventory.ini` file and that the SSH keys in GitHub Secrets match the EC2 instance's key pair.
- `docker-compose.yml` is stored locally and then copied to the remote system through Ansible.
- In this project, `docker-compose.yml` gives Jenkins access to the host's Docker daemon on the remote system. Beware: this is a bad practice and is discouraged.
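For reference, an `inventory.ini` entry might look like the sketch below. The group name, user, IP placeholder, and key path are all hypothetical; substitute your instance's actual public IP and key:

```ini
; Illustrative inventory.ini; replace the IP and key path with your own.
[ec2]
<EC2_PUBLIC_IP> ansible_user=ubuntu ansible_ssh_private_key_file=~/.ssh/your_key.pem
```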
Before using this repository, ensure you have the following:
- **AWS Account**:
  - Access to create and manage resources like VPCs, EC2 instances, and S3 buckets.
  - IAM credentials (Access Key and Secret Key) for configuring Terraform.
- **Local System Requirements**:
  - Terraform: Installed and configured (version 1.0.3 or later).
  - Ansible: Installed (latest version recommended).
  - Docker: Installed and running, with Docker Compose installed.
- **GitHub Account**:
  - A repository with configured GitHub Actions.
  - Secrets for AWS credentials added to GitHub:
    - `TF_USER_AWS_KEY`: AWS Access Key.
    - `TF_USER_AWS_SECRET`: AWS Secret Key.
- **SSH Key Pair**:
  - A valid SSH key pair for accessing the EC2 instance.
  - The private key securely stored for use with Ansible.
- **Required Tools**:
  - Git: To clone the repository and manage version control.
  - Python and pip: Required for managing Ansible and its dependencies.
- **Environment Configurations**:
  - Ensure `terraform` and `docker-compose` commands are accessible via the terminal.
  - Sufficient permissions on your local system to install packages and run scripts.
Follow these steps to set up and run the project:
```shell
git clone https://github.com/hafeez381/devops-superhero-project.git
cd devops-superhero-project
```
Update the `backend.tf` file with your S3 bucket name and region. Modify the `main.tf` and `variables.tf` files with your specific configurations.
```shell
git add .
git commit -m "Initial setup"
git push origin dev
```
The pipeline will automatically run on every pull request to the main branch.
Here are some useful references for further reading:
- Terraform Documentation
- Ansible Documentation
- Docker Compose Documentation
- GitHub Actions Documentation