Add xAI Grok as an LLM provider #54

@calesthio

Description

Summary

Add xAI Grok as a supported LLM provider in Crucix.

Crucix already supports multiple providers (anthropic, openai, gemini, codex, openrouter, minimax, mistral). Grok would be a useful addition for contributors and operators who already use xAI models and want another option for briefing synthesis, alert evaluation, and idea generation.

Proposed scope

  • add grok / xAI provider wiring to the LLM abstraction
  • support environment-based configuration in the same style as existing providers
  • define a sensible default model
  • document setup in README.md / env docs
  • add provider tests comparable to the other LLM integrations
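To make the configuration point concrete, here is a minimal sketch of what the env wiring could look like. The variable names (`XAI_API_KEY`, `GROK_MODEL`) are assumptions chosen to mirror the style of the existing providers; the actual names should follow whatever convention Crucix already uses:

```shell
# Hypothetical env vars, mirroring the existing provider pattern.
export LLM_PROVIDER=grok
export XAI_API_KEY=...           # API key from the xAI console
export GROK_MODEL=grok-2-latest  # optional override of the provider default
```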

Expected implementation areas

  • provider selection / dispatch
  • request formatting and auth headers
  • model default and model override handling
  • any provider-specific response parsing
  • tests for normal request/response behavior
  • optional env-gated integration test if the repo pattern already supports that
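As a starting point for the areas above, here is a rough sketch of a provider class covering env-based config, auth headers, request formatting, and response parsing. All names (`GrokProvider`, `XAI_API_KEY`, `GROK_MODEL`) and the default model string are assumptions, not taken from the Crucix codebase; xAI exposes an OpenAI-compatible chat-completions API, which is what this assumes:

```python
import json
import os


class GrokProvider:
    """Sketch of an xAI Grok provider. Names and defaults are illustrative,
    not taken from the Crucix codebase."""

    BASE_URL = "https://api.x.ai/v1/chat/completions"  # xAI's OpenAI-compatible endpoint
    DEFAULT_MODEL = "grok-2-latest"  # placeholder; confirm against xAI's current model list

    def __init__(self, api_key=None, model=None):
        # Environment-based configuration, in the same style as other providers.
        self.api_key = api_key or os.environ.get("XAI_API_KEY", "")
        self.model = model or os.environ.get("GROK_MODEL", self.DEFAULT_MODEL)

    def build_headers(self):
        # Standard Bearer-token auth.
        return {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json",
        }

    def build_payload(self, system_prompt, user_prompt):
        # OpenAI-style chat-completions request body.
        return {
            "model": self.model,
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_prompt},
            ],
        }

    @staticmethod
    def parse_response(raw):
        # Extract the assistant text from an OpenAI-style response body.
        data = json.loads(raw)
        return data["choices"][0]["message"]["content"]
```

Keeping request building and response parsing as pure methods (no network I/O) makes the normal request/response tests straightforward, since they can assert on headers and payloads without hitting the API.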

Acceptance criteria

  • LLM_PROVIDER=grok works for the same flows as other providers
  • failures degrade gracefully and do not crash sweeps
  • docs explain required env vars and model defaults
  • tests pass and match the repo's current LLM provider patterns
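For the "degrade gracefully" criterion, one possible shape is a boundary wrapper that logs and returns a sentinel instead of letting provider errors escape into a sweep. The `complete` method and the logger name are hypothetical, used only to illustrate the idea:

```python
import logging

logger = logging.getLogger("crucix.llm")  # hypothetical logger name


def evaluate_with_fallback(provider, prompt):
    """Call the provider but never let a provider error crash a sweep.

    `provider` is assumed to expose a `complete(prompt) -> str` method;
    the name is illustrative, not from the Crucix codebase.
    """
    try:
        return provider.complete(prompt)
    except Exception as exc:  # deliberately broad at the provider boundary
        logger.warning("LLM call failed, continuing sweep without it: %s", exc)
        return None
```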

Notes

Please follow the existing provider implementation style instead of introducing a one-off path. If someone in the community wants to pick this up, comment on the issue and go for it.


Labels: enhancement (New feature or request), help wanted (Extra attention is needed)
