
Failed to initialize database #81

@Chuckame

Description

I'm getting this error when starting the comfyui container:

Failed to initialize database. Please ensure you have installed the latest requirements. If the error persists, please report this as in future the database will be required: (sqlite3.OperationalError) unable to open database file
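In case it helps narrow this down, a check along these lines (the exact path is an assumption on my side, based on BASE_DIRECTORY in the compose file below) should show whether the container user can actually write under the base directory where the SQLite file presumably ends up:

# docker compose exec comfyui ls -ld /app/basedir
# docker compose exec comfyui touch /app/basedir/.write-test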

There is no issue when using webui/ollama for regular text chat, so GPU passthrough seems to be working.
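To double-check that the GPU is also visible inside the comfyui container itself (and not only in ollama), something like this should work, assuming the NVIDIA container toolkit mounts nvidia-smi into the container:

# docker compose exec comfyui nvidia-smi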

I have never used ComfyUI before, so sorry in advance if I'm simply using it wrong!

Then, when I try to execute a workflow, I get:

got prompt
Warning, This is not a checkpoint file, trying to load it as a diffusion model only.
model weight dtype torch.bfloat16, manual cast: None
model_type FLOW
/comfyui-nvidia_init.bash: line 732:   175 Killed                  ${COMFY_CMDLINE_BASE} ${COMFY_CMDLINE_EXTRA}
!! ERROR: ComfyUI failed or exited with an error
!! Exiting script (ID: 33)
!! ERROR: subscript failed
!! Exiting script (ID: 1)
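The "Killed" on line 732 of comfyui-nvidia_init.bash looks like the process received SIGKILL; if that was the kernel OOM killer, something along these lines on the Proxmox host or inside the LXC should show a trace of it (just a guess on my part):

# dmesg -T | grep -i -E 'oom|killed process'
# journalctl -k | grep -i oom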

I'm currently running this stack in an LXC container with GPU passthrough on Proxmox.

LXC versions
# nvidia-smi --version
NVIDIA-SMI version  : 580.95.05
NVML version        : 580.95
DRIVER version      : 580.95.05
CUDA Version        : 13.0

# docker --version
Docker version 28.5.0, build 887030f

# uname -a
Linux docker 6.14.8-3-bpo12-pve #1 SMP PREEMPT_DYNAMIC PMX 6.14.8-3~bpo12+1 (2025-09-12T11:29Z) x86_64 GNU/Linux

Proxmox versions
# nvidia-smi --version
NVIDIA-SMI version  : 580.95.05
NVML version        : 580.95
DRIVER version      : 580.95.05
CUDA Version        : 13.0

# uname -a
Linux pve2 6.14.8-3-bpo12-pve #1 SMP PREEMPT_DYNAMIC PMX 6.14.8-3~bpo12+1 (2025-09-12T11:29Z) x86_64 GNU/Linux

docker compose content
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama
    restart: always
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

  ollama-webui:
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - ollama_webui:/app/backend/data
    depends_on:
      - ollama
      - comfyui
    ports:
      - 8080:8080
    environment:
      OLLAMA_BASE_URL: http://ollama:11434
      RAG_EMBEDDING_ENGINE: ollama
      RAG_EMBEDDING_MODEL_AUTO_UPDATE: True

      WHISPER_LANGUAGE: fr

      ENABLE_IMAGE_GENERATION: True
      IMAGE_GENERATION_ENGINE: comfyui
      COMFYUI_BASE_URL: http://comfyui:8188
      IMAGE_GENERATION_MODEL: flux1-schnell.safetensors
      # COMFYUI_API_KEY: TBD in ui
    restart: always

  comfyui:
    image: mmartial/comfyui-nvidia-docker:ubuntu24_cuda13.0-latest
    ports:
      - 8188:8188
    volumes:
      - comfyui_data:/app/basedir
      - comfyui_run:/comfy/mnt
    restart: always
    environment:
      WANTED_UID: 0
      WANTED_GID: 0
      BASE_DIRECTORY: /app/basedir
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu, compute, utility]

  filebrowser:
    image: filebrowser/filebrowser:s6
    ports:
      - 8081:80
    restart: always
    volumes:
      - comfyui_data:/srv/comfyui_data
      - filebrowser_db:/database
      - filebrowser_config:/config
    environment:
      PUID: 0
      PGID: 0

volumes:
  ollama_data:
  ollama_webui:
  comfyui_data:
  comfyui_run:
  filebrowser_config:
  filebrowser_db:
