
Add Support for Ollama Environment Checks in check_env.sh #26

@Zena4L


Description:
I’m using Embabel with the Ollama provider (embabel.llm.provider=ollama, embabel.llm.model=deepseek-r1:latest) to run a local LLM in my project. The current scripts/check_env.sh script checks for OPENAI_API_KEY and ANTHROPIC_API_KEY and exits with an error if both are missing, even when Ollama is configured and neither key is required. This prevents the application from starting via ./scripts/shell.sh when configured for Ollama.

Proposed Feature:
Add a check to scripts/check_env.sh (or related scripts) that detects embabel.llm.provider=ollama in application.properties and skips the OpenAI/Anthropic API key checks when Ollama is configured. This would improve usability for users running local LLMs with Ollama.

Example Implementation:
Modify check_env.sh to include a check like:

if [ -f "src/main/resources/application.properties" ]; then
    if grep -q "embabel.llm.provider=ollama" "src/main/resources/application.properties"; then
        echo "Ollama provider detected. Skipping OpenAI and Anthropic API key checks."
        exit 0
    fi
fi
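
For context, here is a minimal sketch of how this guard could sit in front of the existing key checks. The key-check logic below is an assumption about what check_env.sh currently does, not its exact contents:

#!/usr/bin/env bash
# Sketch only: the Ollama guard runs before the (assumed) existing key checks.

PROPS="src/main/resources/application.properties"

# If the Ollama provider is configured, no cloud API keys are needed.
if [ -f "$PROPS" ] && grep -q "embabel.llm.provider=ollama" "$PROPS"; then
    echo "Ollama provider detected. Skipping OpenAI and Anthropic API key checks."
    exit 0
fi

# Approximation of the current behavior: fail if neither key is set.
if [ -z "$OPENAI_API_KEY" ] && [ -z "$ANTHROPIC_API_KEY" ]; then
    echo "Error: set OPENAI_API_KEY or ANTHROPIC_API_KEY, or configure embabel.llm.provider=ollama."
    exit 1
fi

With this in place, ./scripts/shell.sh should start cleanly when application.properties selects the Ollama provider, while the existing behavior for OpenAI/Anthropic users is unchanged.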
