This repository contains the necessary scripts and configurations to deploy Ollama, Open WebUI, and the Deepseek model using Docker and Docker Compose. The deployment is automated using Pulumi for infrastructure as code (IaC) on AWS.
- Docker: Ensure Docker is installed on your system.
- Docker Compose: Ensure Docker Compose is installed.
- Pulumi: Ensure Pulumi is installed and configured with your AWS credentials.
- AWS Account: Ensure you have an AWS account with the necessary permissions to create EC2 instances, security groups, and other resources.
```bash
git clone <repository-url>
cd <repository-directory>
```

Ensure you have the necessary configuration values set in `config.ts` or `Pulumi.<stack-name>.yaml`. These configurations include:

- `serviceName`: Name of the service.
- `vpcid`: VPC ID where the instance will be deployed.
- `instanceType`: EC2 instance type.
- `subnetid`: Subnet ID where the instance will be deployed.
- `amiid`: AMI ID for the EC2 instance.
- `keypair`: Key pair for SSH access.
- `rtvolumeSize`: Root volume size for the EC2 instance.
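As an illustration, `config.ts` might read these values with Pulumi's `Config` API. This is a hedged sketch — the key names come from the list above, and the defaults shown here are assumptions, not values taken from this repository:

```typescript
// Hypothetical sketch of config.ts: reading the stack configuration
// listed above via Pulumi's Config API. Defaults are illustrative
// assumptions; adjust to match the repository's actual code.
import * as pulumi from "@pulumi/pulumi";

const config = new pulumi.Config();

export const serviceName = config.require("serviceName");             // name of the service
export const vpcid = config.require("vpcid");                         // target VPC
export const instanceType = config.get("instanceType") ?? "t3.large"; // EC2 type (assumed default)
export const subnetid = config.require("subnetid");                   // target subnet
export const amiid = config.require("amiid");                         // AMI for the instance
export const keypair = config.require("keypair");                     // SSH key pair name
export const rtvolumeSize = config.getNumber("rtvolumeSize") ?? 100;  // root volume size in GiB (assumed default)
```

`config.require` fails the deployment early if a key is missing, which is usually preferable to discovering an unset value mid-provisioning.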
Run the following command to deploy the infrastructure using Pulumi:

```bash
pulumi up
```

This command will create an EC2 instance with Docker and Docker Compose installed, and it will deploy the services defined in the `docker-compose.yaml` file.
Once the deployment is complete, you can access the services as follows:
- Open WebUI: Open your browser and navigate to `http://<public-ip>`. The public IP will be output by Pulumi after the deployment.
- Ollama: The Ollama service will be running on port `11434`.
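The public IP mentioned above is typically surfaced as a Pulumi stack output. A minimal sketch, assuming the EC2 resource is named `instance` in the program (the actual name in this repository may differ):

```typescript
// Hypothetical sketch: exporting the instance's public IP as a stack
// output so `pulumi up` (and `pulumi stack output publicIp`) report it.
import * as aws from "@pulumi/aws";

declare const instance: aws.ec2.Instance; // defined elsewhere in the program

export const publicIp = instance.publicIp;
```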
The Deepseek model will be automatically pulled and deployed by the deepseek-pull service defined in the docker-compose.yaml file. You can verify the model is available by running:
```bash
sudo docker exec -it ollama ollama list
```

You can manage the services using Docker Compose commands:
- Start the services:

  ```bash
  docker-compose up -d
  ```

- Stop the services:

  ```bash
  docker-compose down
  ```

- View logs:

  ```bash
  docker-compose logs -f
  ```
The docker-compose.yaml file defines three services:
- Ollama: The Ollama service running on port `11434`.
- Open WebUI: The Open WebUI service running on port `80`.
- Deepseek-pull: A service that pulls the Deepseek model (`deepseek-r1:7b`) and ensures it is available for use.
- `ollama`: Volume for Ollama data.
- `open-webui`: Volume for Open WebUI data.
- `ollama-network`: A bridge network for communication between the services.
The security group allows inbound traffic on ports 22 (SSH), 80 (HTTP), and 443 (HTTPS). It also allows all outbound traffic.
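The security group rules described above could be declared with `@pulumi/aws` roughly as follows. This is a sketch under assumptions — the resource name and the `vpcId` placeholder are illustrative, not taken from this repository:

```typescript
// Hypothetical sketch of the security group described above:
// inbound 22/80/443, all outbound traffic allowed.
import * as aws from "@pulumi/aws";

const inboundPorts = [22, 80, 443]; // SSH, HTTP, HTTPS

const securityGroup = new aws.ec2.SecurityGroup("ollama-sg", {
    vpcId: "<vpc-id>", // in practice, taken from the stack configuration
    ingress: inboundPorts.map((port) => ({
        protocol: "tcp",
        fromPort: port,
        toPort: port,
        cidrBlocks: ["0.0.0.0/0"],
    })),
    egress: [{
        protocol: "-1", // all protocols
        fromPort: 0,
        toPort: 0,
        cidrBlocks: ["0.0.0.0/0"],
    }],
});
```

Note that `0.0.0.0/0` on port 22 opens SSH to the world; restricting it to a known CIDR is a common hardening step.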
The EC2 instance is configured with a user data script that:
- Updates the system and installs Docker and Docker Compose.
- Adds the current user to the Docker group.
- Restarts and enables Docker.
- Writes the `docker-compose.yaml` file to the instance.
- Starts the Docker Compose services.
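The steps above can be sketched as a plain helper that assembles the user-data script. This is an illustration only, assuming a yum-based Amazon Linux AMI and an `ec2-user` account — the actual script is embedded in this repository's Pulumi program:

```typescript
// Illustrative helper assembling a user-data script following the
// steps above. Assumes an Amazon Linux AMI (yum) and ec2-user; the
// real script is generated inside the repository's Pulumi program.
function buildUserData(composeFileContents: string): string {
  return [
    "#!/bin/bash",
    "yum update -y",                          // update the system
    "yum install -y docker",                  // install Docker
    "curl -L https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m) -o /usr/local/bin/docker-compose",
    "chmod +x /usr/local/bin/docker-compose", // install Docker Compose
    "usermod -aG docker ec2-user",            // add the user to the docker group
    "systemctl enable docker",                // enable Docker at boot
    "systemctl restart docker",               // restart Docker
    "mkdir -p /opt/ollama",
    // write the compose file to the instance
    `cat > /opt/ollama/docker-compose.yaml <<'EOF'\n${composeFileContents}\nEOF`,
    "cd /opt/ollama && docker-compose up -d", // start the services
  ].join("\n");
}
```

Keeping the script builder as a pure function makes the generated user data easy to inspect before handing it to the EC2 instance.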
- Docker Permissions: If you encounter permission issues, ensure your user is in the `docker` group (the user data script adds it, but you must log out and back in for the change to take effect).
- Service Health: The `deepseek-pull` service depends on the `open-webui` service being healthy. If the `open-webui` service fails, the `deepseek-pull` service will not start.
This project is licensed under the MIT License. See the LICENSE file for details.
https://github.com/open-webui/open-webui
For any issues or questions, please open an issue in the repository.