Open PowerShell as Administrator and run:
wsl --install
If WSL is already installed, update it:
wsl --update
Reboot your system if prompted.
- Open Microsoft Store
- Search for Ubuntu
- Choose a version (e.g., Ubuntu 22.04 LTS)
- Click Get or Install
- Launch Ubuntu from Start Menu and set up username/password
Run the following commands in the Ubuntu terminal:
# 1. Update package index and install dependencies
sudo apt update
sudo apt install ca-certificates curl gnupg lsb-release -y
# 2. Add Docker’s official GPG key
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
# 3. Set up the Docker repository
echo \
"deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] \
https://download.docker.com/linux/ubuntu \
$(lsb_release -cs) stable" | \
sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# 4. Install Docker Engine
sudo apt update
sudo apt install docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin -y
# 5. Add user to docker group (optional but recommended)
sudo usermod -aG docker $USER
🔁 Restart the Ubuntu terminal after running the above to apply group changes.
✅ You can now run Docker inside Ubuntu WSL:
docker --version
Clone the project repository and create a virtual environment:
git clone https://github.com/data-guru0/LLMOPS-TESTING-1.git
cd LLMOPS-TESTING-1
python -m venv venv
venv\Scripts\activate
(Use source venv/bin/activate instead if you are working inside the WSL/Ubuntu terminal.)
Install the required libraries using:
pip install -e .
Then run the application:
python app/main.py
The following essential setup steps have been completed:
- ✅ WSL Setup Complete
  - Ubuntu installed via Microsoft Store
  - Docker Engine installed inside Ubuntu WSL
  - Project runs successfully in WSL
- ✅ Dockerfile Created
  - Dockerfile written for the project
  - Environment variables setup will be handled later
  - Do not include .env in the Dockerfile for now
- ✅ GitHub Setup Completed
  - Project is pushed to GitHub
  - .gitignore is properly configured and includes .env
🟢 You are now ready to move forward with the Deployment phase.
Follow the steps below to deploy the application.
- Make sure you run commands inside a WSL terminal in VS Code
Follow the steps below to set up Jenkins inside a Docker container and configure it for the project:
Build the Docker image for Jenkins:
docker build -t jenkins-dind .
Run the Jenkins container with the following command:
docker run -d --name jenkins-dind \
--privileged \
-p 8080:8080 -p 50000:50000 \
-v /var/run/docker.sock:/var/run/docker.sock \
-v jenkins_home:/var/jenkins_home \
jenkins-dind
After successful execution, you'll receive a long alphanumeric string (the container ID).
To verify if the Jenkins container is running:
docker ps
To retrieve Jenkins logs and get the initial admin password:
docker logs jenkins-dind
You should see a password in the output. Copy that password.
Run the following command to get the IP address of your WSL environment:
ip addr show eth0 | grep inet
Now, access Jenkins in your browser using the following URL (replace 172.23.129.123 with the actual WSL IP address you retrieved):
http://172.23.129.123:8080
Return to the terminal and run the following commands to install Python inside the Jenkins container:
docker exec -u root -it jenkins-dind bash
apt update -y
apt install -y python3
python3 --version
ln -s /usr/bin/python3 /usr/bin/python
python --version
apt install -y python3-pip
exit
Restart the Jenkins container to apply the changes:
docker restart jenkins-dind
Go to the Jenkins dashboard and sign in using the initial admin password you retrieved earlier.
Follow the steps below to integrate GitHub with Jenkins for automated pipeline execution:
- Go to GitHub.
- Navigate to Settings -> Developer Settings -> Personal Access Tokens -> Classic.
- Click on Generate New Token.
- Provide a name and select the following permissions:
  - repo (for repository access)
  - repo_hook (for hook access)
- Click Generate Token.
- Save the token securely somewhere (you will not be able to view it again after this page).
- Go to the Jenkins Dashboard.
- Click Manage Jenkins -> Manage Credentials -> Global.
- Click Add Credentials.
- In the Username field, enter your GitHub account name.
- In the Password field, paste the GitHub token you just generated.
- In the ID field, enter a name for this credential (e.g., github-token).
- Add a Description (e.g., GitHub access token).
- Click OK to save the credentials.
- Go to the Jenkins Dashboard.
- Click on New Item.
- Select Pipeline and provide a name for the job.
- Click Apply and then Create.
- On the left sidebar of the Jenkins job, click Pipeline Syntax.
- Under Step, select checkout.
- Fill in the necessary details, such as:
- Repository URL (your GitHub repository URL)
- Credentials (select the github-token credential created earlier)
- Click Generate Pipeline Script.
- Copy the generated script.
- Open VS Code and create a file named Jenkinsfile (already done if you cloned the repo).
- For now, keep only the first stage of the Jenkinsfile; the rest should be commented out.
Explanation: This simple pipeline has one stage, Checkout, where Jenkins will fetch the latest code from your GitHub repository.
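For reference, a minimal checkout-only Jenkinsfile might look like the sketch below. The repository URL, branch name, and the github-token credential ID are taken from this guide as assumptions; adjust them to your own repository, and feel free to paste the exact snippet generated by Pipeline Syntax in place of the git step.

```groovy
pipeline {
    agent any

    stages {
        stage('Checkout') {
            steps {
                // Fetch the latest code from GitHub using the 'github-token' credential
                // created above. URL and branch are placeholders; adjust as needed.
                git url: 'https://github.com/data-guru0/LLMOPS-TESTING-1.git',
                    branch: 'main',
                    credentialsId: 'github-token'
            }
        }

        // Later stages (SonarQube analysis, ECR push, ECS deploy) stay commented out for now.
    }
}
```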
- Push the Jenkinsfile to your GitHub repository.
- Go back to the Jenkins Dashboard.
- Click on Build Now for your pipeline job.
- Wait for the build process to complete.
Once the pipeline finishes, you will see a success message, indicating that your first pipeline run was successful. Additionally, in the Workspace of the job, you will see that Jenkins has cloned your GitHub repository.
Follow these steps to integrate SonarQube with Jenkins for code quality analysis.
- Go to DockerHub and search for SonarQube. Scroll down to find the commands.
- Run the following commands in a new WSL terminal to configure the system:
sudo sysctl -w vm.max_map_count=524288
sudo sysctl -w fs.file-max=131072
ulimit -n 131072
ulimit -u 8192
- Run the SonarQube container with the appropriate settings. Make sure to change the container name to sonarqube-dind and remove the dollar sign ($) from the command. You will find the command in the Demo section of the SonarQube page on DockerHub.
docker run -d --name sonarqube-dind \
-p 9000:9000 \
-e SONARQUBE_JDBC_URL=jdbc:postgresql://localhost/sonar \
sonarqube
- Check if the container is running:
docker ps
- Access SonarQube at http://<WSL_IP>:9000 (replace <WSL_IP> with your WSL IP address). Log in using the default credentials:
  - Username: admin
  - Password: admin
- Go to Jenkins Dashboard -> Manage Jenkins -> Manage Plugins.
- Install the following plugins:
  - SonarScanner
  - SonarQualityGates
- Restart the Jenkins container:
docker restart jenkins-dind
- Go to SonarQube -> Create a Local Project.
  - Enter a name for the project (e.g., LLMOPS).
  - Set the Main Branch.
  - Save the project.
- Go to SonarQube -> My Account (top-right) -> Security -> Generate New Token.
  - Provide a name (e.g., global-analysis-token) and generate the token.
  - Copy the generated token.
- Go to Jenkins Dashboard -> Manage Jenkins -> Credentials -> Global.
- Add a new Secret Text credential:
  - ID: sonarqube-token
  - Secret: Paste the token from SonarQube.
  - Click OK to save.
- Go to Manage Jenkins -> System Configuration.
- Scroll down to SonarQube Servers and click Add SonarQube.
  - Name: SonarQube (or any name you prefer)
  - URL: http://<WSL_IP>:9000 (replace <WSL_IP> with your actual IP address)
  - Select the SonarQube token from the credentials dropdown.
  - Apply and save.
- Go to Manage Jenkins -> Tools and look for SonarQube Scanner.
  - Select SonarQube Scanner and configure it.
  - Tick the option Install Automatically.
- Open the Jenkinsfile in VS Code and add the SonarQube stage (already provided in the code).
- Push the changes to your GitHub repository.
- Run the following command to create a new Docker network:
docker network create dind-network
- Connect both containers to the new network:
docker network connect dind-network jenkins-dind
docker network connect dind-network sonarqube-dind
- Update the Jenkinsfile to use the container name instead of the IP address (already done in the code); see the stage sketch after this list:
-Dsonar.host.url=http://sonarqube-dind:9000
- Trigger the Jenkins pipeline.
- The build should now be successful, and the code will be analyzed by SonarQube.
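For reference, the SonarQube analysis stage inside the Jenkinsfile's stages block might look roughly like the sketch below. The stage name, the scanner tool name (sonar-scanner), the server name (SonarQube), and the project key (llmops) are assumptions; they must match the tool, server, and project you configured above.

```groovy
stage('SonarQube Analysis') {
    steps {
        // 'SonarQube' must match the server name set under Manage Jenkins -> System
        // Configuration; 'sonar-scanner' must match the Scanner tool name under Tools.
        // withSonarQubeEnv injects the server URL and the sonarqube-token automatically.
        withSonarQubeEnv('SonarQube') {
            script {
                def scannerHome = tool 'sonar-scanner'
                sh "${scannerHome}/bin/sonar-scanner " +
                   "-Dsonar.projectKey=llmops " +
                   "-Dsonar.sources=. " +
                   "-Dsonar.host.url=http://sonarqube-dind:9000"
            }
        }
    }
}
```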
Go to SonarQube and see the code quality report generated for your project.
Follow these steps to set up AWS integration with Jenkins for building and pushing Docker images to Amazon ECR.
- Go to Manage Jenkins -> Manage Plugins.
- Search for and install the following plugins:
  - AWS SDK (All)
  - AWS Credentials
- Restart jenkins-dind after the plugin installation:
docker restart jenkins-dind
- Go to the AWS Console → IAM → Users → Add User.
- Add the necessary policies:
  - Attach the policy: AmazonEC2ContainerRegistryFullAccess
- Once the user is created, select the user and click on Create Access Key.
- Copy the Access Key ID and Secret Access Key.
- Go to Jenkins Dashboard → Manage Jenkins → Manage Credentials → Global.
- Add a new AWS Credentials entry:
  - ID: aws-credentials
  - Access Key ID: Paste the Access Key ID from AWS.
  - Secret Access Key: Paste the Secret Access Key from AWS.
- Save the credentials.
- Open a new terminal and run the following commands inside your jenkins-dind container:
docker exec -u root -it jenkins-dind bash
- Update the package list and install required tools:
apt update
apt install -y unzip curl
- Download and install the AWS CLI:
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
./aws/install
- Verify the installation:
aws --version
- Exit the container:
exit
- Go to the AWS Console → ECR (Elastic Container Registry) → Create Repository.
- Name the repository (e.g., my-repository).
- Set up the repository as required and save the repository URL for later use.
(The corresponding Jenkinsfile stage is already in place if you cloned the repo; just change it according to your repository name and URI.)
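For reference, the build-and-push stage typically looks along the lines of the sketch below. The AWS account ID, region (us-east-1), registry and repository names, image tag, and the aws-credentials ID are assumptions; substitute your own values.

```groovy
stage('Build and Push to ECR') {
    environment {
        // Placeholders; replace with your AWS region, account registry, and ECR repository.
        AWS_REGION   = 'us-east-1'
        ECR_REGISTRY = '123456789012.dkr.ecr.us-east-1.amazonaws.com'
        ECR_REPO     = 'my-repository'
        IMAGE_TAG    = 'latest'
    }
    steps {
        // Uses the 'aws-credentials' entry created in Jenkins to authenticate with AWS.
        withCredentials([[$class: 'AmazonWebServicesCredentialsBinding',
                          credentialsId: 'aws-credentials',
                          accessKeyVariable: 'AWS_ACCESS_KEY_ID',
                          secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
            sh '''
                aws ecr get-login-password --region $AWS_REGION | \
                  docker login --username AWS --password-stdin $ECR_REGISTRY
                docker build -t $ECR_REGISTRY/$ECR_REPO:$IMAGE_TAG .
                docker push $ECR_REGISTRY/$ECR_REPO:$IMAGE_TAG
            '''
        }
    }
}
```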
Push the updated Jenkinsfile to your GitHub repository to trigger the pipeline.
- Go to the Jenkins Dashboard.
- Click on Build Now for your pipeline.
- The pipeline will execute, building the Docker image and pushing it to Amazon ECR.
✅ Congratulations! Your Docker image has been successfully built and pushed to Amazon ECR using Jenkins.
Follow these steps to deploy your app to AWS ECS Fargate using Jenkins and automate the deployment process.
-
Create ECS Cluster:
- Go to ECS → Clusters → Create Cluster.
- Give your cluster a name and select Fargate.
- Click Create to create the cluster.
-
Create ECS Task Definition:
- Go to ECS → Task Definitions → Create new Task Definition.
- Select Fargate as the launch type.
- Give the task definition a name (e.g., llmops-task).
-
Container Configuration:
- Under Container details, give the container a name and use the ECR URI (the Docker image URL from your ECR repository).
- In Port Mapping, use the following configuration:
- Port: 8501
- Protocol: TCP
- Leave the remaining options (such as the app protocol) at their defaults.
-
Create Task Definition:
- Click Create to create the task definition.
- Go to ECS → Clusters → Your cluster.
- Click Create Service.
- Select your Task Definition (llmops-task).
- Select Fargate as the launch type (this should be the default option).
- Give the service a name (e.g., llmops-service).
- Under Networking, select:
- Public IP: Allow a public IP.
- Click Create and wait for a few minutes for the service to be deployed.
- Search for Security Groups in the AWS console.
- Select the Default security group.
- Go to the Inbound Rules and click Edit inbound rules.
- Add a new Custom TCP rule with the following details:
- Port range: 8501
- Source: 0.0.0.0/0 (allow access from all IPs).
- Save the rules.
- After the ECS service has been deployed (this may take a few minutes), go to your ECS cluster.
- Open the Tasks tab and copy the Public IP of your task.
- Open a browser and visit http://<PublicIP>:8501.
- You should see your app running.
- Add the ECS Full Access policy to the IAM user:
  - Go to IAM → Users → Your IAM User → Attach Policies.
  - Attach the AmazonEC2ContainerServiceFullAccess policy to the IAM user.
- Update the Jenkinsfile for ECS deployment:
  - Add the deployment stage to your Jenkinsfile. This will automate the deployment of your Docker container to AWS ECS; a sketch of this stage is shown after this list.
- Push the updated code to GitHub.
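For reference, the ECS deployment stage might look roughly like the sketch below. The cluster name (llmops-cluster), service name (llmops-service), region, and the aws-credentials ID are assumptions; replace them with the names you created in the ECS console.

```groovy
stage('Deploy to ECS Fargate') {
    steps {
        // Forces the service to pull the newly pushed image and start fresh tasks.
        // Cluster, service, and region names are placeholders; use your own values.
        withCredentials([[$class: 'AmazonWebServicesCredentialsBinding',
                          credentialsId: 'aws-credentials',
                          accessKeyVariable: 'AWS_ACCESS_KEY_ID',
                          secretKeyVariable: 'AWS_SECRET_ACCESS_KEY']]) {
            sh '''
                aws ecs update-service \
                  --cluster llmops-cluster \
                  --service llmops-service \
                  --force-new-deployment \
                  --region us-east-1
            '''
        }
    }
}
```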
- Go to Jenkins Dashboard.
- Click on Build Now to trigger the Jenkins pipeline.
- The pipeline will run, and you will see the task in the ECS Service go to In Progress.
- Once the pipeline is complete, your service will be Running again.
- Open the ECS cluster and check the Task status.
- After the task is successfully deployed, visit your app at http://<PublicIP>:8501 to ensure it is working.
- Go to your ECS Task Definition in the AWS Console.
- Edit the container definition.
- Scroll to the Environment Variables section.
- Add the following environment variables:
  - GROQ_API_KEY: gsk_...
  - TAVILY_API_KEY: tvly-dev-...
- Save the changes and redeploy the task.
Your app is now deployed to AWS ECS Fargate. You can access it via the public IP at port 8501. The deployment process has been automated using Jenkins, and the app is now live.