A modern backup scheduling application with support for various backup targets including local directories, SFTP, SMB, Dropbox, and Google Drive
Screenshots, left to right: Dashboard, Create Schedule, Backup Targets
BackupPro is a comprehensive backup scheduling solution designed to simplify the process of creating, managing, and monitoring backups across various storage targets. With an intuitive web interface, it allows users to set up automated backup schedules, track backup status, and ensure data safety with minimal effort.
- Flexible Scheduling: Create backup schedules with customizable days and times
- Multiple Backup Targets: Support for local directories, SFTP, SMB, Dropbox, and Google Drive
- Dashboard Overview: Monitor backup status, storage usage, and success rates at a glance
- Status Notifications: Get informed about backup successes and failures
- Manual Backups: Trigger immediate backups when needed
- File Explorer: Browse your file system to select backup sources
- Cloud Integration: Connect to popular cloud storage services
- Secure Connections: SFTP support with password or key-based authentication
- Dark Mode: Eye-friendly interface for day and night use
- Detailed Logs: Track all backup activities with comprehensive logs
- Docker and Docker Compose
- Node.js (for development)
- Supported platforms (a quick architecture check is shown below):
  - x86/AMD64 (standard PCs and servers)
  - ARM64 (Raspberry Pi 4 or newer with a 64-bit OS)
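If you are not sure which architecture your host provides, you can check it with standard system tools (nothing BackupPro-specific):

```bash
# Print the CPU architecture of the machine that will run the containers
uname -m                     # x86_64 -> AMD64 images, aarch64 -> ARM64 images

# On Debian/Ubuntu-based systems (including 64-bit Raspberry Pi OS)
dpkg --print-architecture    # amd64 or arm64
```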
1. Clone the repository:

   git clone https://github.com/yourusername/BackupPro.git
   cd BackupPro

2. Configure the application:

   cp .env.example .env
   # Edit the .env file to match your environment

3. Start the Docker containers:

   docker-compose up -d

4. Access the application at http://localhost:3000
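To confirm that the stack came up cleanly, the usual Compose status and log commands can be used (service names as referenced elsewhere in this README):

```bash
# List the BackupPro containers with their state and published ports
docker-compose ps

# Inspect startup output if the UI is not reachable on port 3000
docker-compose logs --tail=50 backend
```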
You can also use pre-built Docker images from Docker Hub:
1. Download the Docker Compose file:

   curl -O https://raw.githubusercontent.com/bangertech/backup-pro/master/docker-compose.yml

2. Create a .env file with your settings:

cat > .env << 'EOL'
# Timezone settings
TZ=Europe/Berlin
# Ports for the different services
FRONTEND_PORT=3000
BACKEND_PORT=4000
POSTGRES_PORT=5432
# Database settings
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=backup_schedule
# Application settings
NODE_ENV=production
FILE_EXPLORER_BASE_DIR=/host_fs
EOL

3. Start the application:

   docker compose up -d

4. Access the application at http://localhost:3000
For development purposes, you can use the development Docker Compose configuration:
docker compose -f docker-compose.dev.yml up
This will build the images locally instead of using the pre-built ones from Docker Hub.
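If you have changed source files and want Compose to rebuild the local images before starting, the standard `--build` flag applies (generic Compose behavior, not specific to this project):

```bash
# Force a rebuild of the development images, then start the stack
docker compose -f docker-compose.dev.yml up --build

# Same, but detached
docker compose -f docker-compose.dev.yml up --build -d
```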
To update to the latest version when using Docker Hub images:
docker compose pull
docker compose up -d
BackupPro can be configured using environment variables in the .env file:
# Timezone settings
# Set your desired timezone here
TZ=Europe/Berlin
# Network settings
# Set your local IP address or hostname here
HOST_IP=192.168.2.86
# Ports for the different services
FRONTEND_PORT=3000
BACKEND_PORT=4000
POSTGRES_PORT=5432
# Database settings
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=backup_schedule
# Application settings
NODE_ENV=production
FILE_EXPLORER_BASE_DIR=/host_fs
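To verify that your .env values are actually being picked up, Compose can print the fully resolved configuration; after editing the file, the containers need to be recreated (both are generic Compose features):

```bash
# Show the merged Compose configuration with .env variables substituted
docker compose config

# Recreate the containers so edited .env values take effect
docker compose up -d
```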
The timezone setting (TZ) is used across all components (frontend, backend, and PostgreSQL) to ensure consistent time display and scheduling. Examples of valid timezones:
- Europe/Berlin (Germany)
- Europe/London (UK)
- America/New_York (US Eastern)
- America/Los_Angeles (US Pacific)
- Asia/Tokyo (Japan)
For a complete list of timezones, see the IANA tz database reference.
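Two generic ways to look up valid tz database names on a Linux host (not specific to BackupPro):

```bash
# On systems with systemd
timedatectl list-timezones | grep -i berlin

# Or browse the tz database files directly
ls /usr/share/zoneinfo/Europe
```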
When adding cloud storage targets like Dropbox or Google Drive:
- Navigate to "Targets" and click "Create New Target"
- Select the target type (e.g., Dropbox or Google Drive)
- Enter the required OAuth credentials (Client ID and Secret)
- These credentials are securely stored in the database for future operations
To set up a backup target:
- Navigate to the "Targets" section
- Click "Create New Target"
- Select the target type (Local, SFTP, Dropbox, Google Drive)
- Fill in the required information:
  - For local targets: Path to the backup destination
  - For SFTP: Server details, credentials, and path (a key-generation sketch is shown after this list)
  - For cloud services: Connect via OAuth
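For key-based SFTP authentication, a typical preparation looks like the sketch below; it uses standard OpenSSH tooling, and the user, host, and key path are placeholders to adapt to your environment:

```bash
# Generate a dedicated, passphrase-less key pair for unattended backups
ssh-keygen -t ed25519 -N "" -f ~/.ssh/backuppro_ed25519 -C "backuppro"

# Install the public key on the SFTP server (placeholder user/host)
ssh-copy-id -i ~/.ssh/backuppro_ed25519.pub backupuser@sftp.example.com

# Verify that key-based login works before configuring the target
sftp -i ~/.ssh/backuppro_ed25519 backupuser@sftp.example.com
```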
To create a backup schedule:
- Go to the "Schedules" section
- Click "Create Schedule"
- Configure your schedule:
  - Name your backup schedule
  - Select the source directory to back up
  - Choose a backup target
  - Set the schedule (days of the week and time)
  - Enable/disable the schedule as needed
The Dashboard provides an overview of:
- Total backups performed
- Active schedules
- Storage usage
- Success rate
- Recent backup status
Click on any metric for detailed information.
Backup fails with "Permission denied"
- Check that the application has read access to the source directory
- Verify write permissions on the target location
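A couple of generic checks for this case, assuming the default FILE_EXPLORER_BASE_DIR=/host_fs mount from the configuration above and placeholder paths:

```bash
# Can the backend container read the source directory?
docker-compose exec backend ls -ld /host_fs/path/to/source

# Is the (local) target writable for the backend container?
docker-compose exec backend touch /path/to/target/.write_test
docker-compose exec backend rm /path/to/target/.write_test
```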
Schedule not running at the expected time
- Confirm the timezone setting in your .env file
- Check that the schedule is marked as "Active"
- Verify the server time with docker-compose exec backend date
Cannot connect to cloud services
- Ensure your OAuth credentials are correct
- Check your internet connection
- Verify that the service APIs are accessible from your server
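To rule out basic network problems, you can probe the public API hosts from the server and from inside the backend container (assuming curl is available in the image); any HTTP status code in the response means the host is reachable:

```bash
# Check HTTPS reachability of the Dropbox and Google OAuth endpoints
curl -sS -o /dev/null -w "%{http_code}\n" https://api.dropboxapi.com
curl -sS -o /dev/null -w "%{http_code}\n" https://oauth2.googleapis.com

# Same check from inside the backend container
docker-compose exec backend curl -sS -o /dev/null -w "%{http_code}\n" https://api.dropboxapi.com
```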
To view application logs:
# Backend logs
docker-compose logs backend
# Frontend logs
docker-compose logs frontend
# Database logs
docker-compose logs postgres
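The usual Compose log flags work here as well, for example to follow output live or limit how much history is printed:

```bash
# Follow backend logs live
docker-compose logs -f backend

# Last 100 lines of all services, with timestamps
docker-compose logs --tail=100 -t
```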
1. Clone the repository:

   git clone https://github.com/yourusername/BackupPro.git
   cd BackupPro

2. Install dependencies:

   # Frontend
   cd frontend
   npm install

   # Backend
   cd ../backend
   npm install

3. Start the development servers:

   # Frontend
   cd frontend
   npm run dev

   # Backend
   cd ../backend
   npm run dev
This project includes a GitHub Actions workflow that automatically builds and pushes Docker images to Docker Hub when changes are pushed to the master branch. To set up this workflow:
- Create a Docker Hub account and repository named bangertech/backup-pro
- In your GitHub repository, add the following secrets:
  - DOCKERHUB_USERNAME: Your Docker Hub username
  - DOCKERHUB_TOKEN: A Docker Hub access token with read/write permissions
The workflow will:
- Build multi-architecture images (AMD64 and ARM64)
- Push them to Docker Hub with appropriate tags
- Create a production-ready Docker Compose file
You can also manually trigger the workflow from the Actions tab in your GitHub repository.
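If you prefer the command line to the GitHub web UI, the secrets can also be added with the GitHub CLI (gh), assuming it is installed and authenticated; the secret values and the workflow file name below are placeholders:

```bash
# Add the Docker Hub credentials as repository secrets
gh secret set DOCKERHUB_USERNAME --body "your-dockerhub-username"
gh secret set DOCKERHUB_TOKEN --body "your-dockerhub-access-token"

# Optionally trigger the workflow manually (adjust the workflow file name)
gh workflow run docker-build.yml
```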
- /frontend - React.js frontend application
- /backend - Node.js backend API
- /db - Database migrations and seeds
This project is licensed under the MIT License.
Keywords: backup, scheduler, automation, data-protection, cloud-backup, sftp, dropbox, google-drive, docker, react, node.js, typescript, postgresql, backup-management, web-interface