Add xAI Grok as an LLM provider #54
Open
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)
Description
Summary
Add xAI Grok as a supported LLM provider in Crucix.
Crucix already supports multiple providers (anthropic, openai, gemini, codex, openrouter, minimax, mistral). Grok would be a useful addition for contributors and operators who already use xAI models and want another option for briefing synthesis, alert evaluation, and idea generation.
Proposed scope
- add `grok/` xAI provider wiring to the LLM abstraction
- support environment-based configuration in the same style as existing providers
- define a sensible default model
- document setup in `README.md` / env docs
- add provider tests comparable to the other LLM integrations
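The environment-based configuration item above could look something like the following sketch. All names here (`XAI_API_KEY`, `XAI_BASE_URL`, `GROK_MODEL`, the default model string) are assumptions for illustration; the actual variables should mirror whatever conventions Crucix's existing providers use. xAI's public API is OpenAI-compatible and served from `https://api.x.ai/v1`.

```python
import os

# Hypothetical defaults; the repo should pick its own canonical names.
GROK_DEFAULTS = {
    "api_key_env": "XAI_API_KEY",
    "base_url": "https://api.x.ai/v1",  # xAI exposes an OpenAI-compatible API
    "default_model": "grok-2-latest",   # placeholder default model
}

def load_grok_config() -> dict:
    """Read Grok settings from the environment, falling back to defaults."""
    api_key = os.environ.get(GROK_DEFAULTS["api_key_env"])
    if not api_key:
        raise RuntimeError("XAI_API_KEY is not set")
    return {
        "api_key": api_key,
        "base_url": os.environ.get("XAI_BASE_URL", GROK_DEFAULTS["base_url"]),
        "model": os.environ.get("GROK_MODEL", GROK_DEFAULTS["default_model"]),
    }
```

A model override via `GROK_MODEL` keeps the "model default and model override handling" item below to a single lookup.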
Expected implementation areas
- provider selection / dispatch
- request formatting and auth headers
- model default and model override handling
- any provider-specific response parsing
- tests for normal request/response behavior
- optional env-gated integration test if the repo pattern already supports that
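For the request-formatting and auth-header items, a minimal sketch is below. It assumes xAI's OpenAI-compatible `/chat/completions` endpoint with Bearer authentication; the dispatch-table registration at the end is hypothetical and should follow whatever selection mechanism the existing providers already use.

```python
import json
import urllib.request

def build_grok_request(cfg: dict, messages: list) -> urllib.request.Request:
    """Format a chat-completion request for xAI's OpenAI-compatible endpoint."""
    body = json.dumps({"model": cfg["model"], "messages": messages}).encode()
    return urllib.request.Request(
        f"{cfg['base_url']}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {cfg['api_key']}",  # xAI uses Bearer auth
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Provider selection would plug into the existing dispatch table, e.g.:
PROVIDERS = {
    # "anthropic": ..., "openai": ..., etc.
    "grok": build_grok_request,  # hypothetical registration point
}
```

Because the wire format matches OpenAI's, response parsing may be able to reuse the existing OpenAI-style parser rather than adding a one-off path.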
Acceptance criteria
- `LLM_PROVIDER=grok` works for the same flows as other providers
- failures degrade gracefully and do not crash sweeps
- docs explain required env vars and model defaults
- tests pass and match the repo's current LLM provider patterns
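The "degrade gracefully" criterion can be sketched as a wrapper that turns a provider failure into a skipped item rather than a crashed sweep. This is an illustrative pattern, not the repo's actual error-handling API; the real implementation should reuse whatever the other providers already do.

```python
def evaluate_with_fallback(call_llm, prompt: str):
    """Return the LLM result, or None if the provider call fails.

    A sweep can then skip the item instead of aborting (assumed behavior).
    """
    try:
        return call_llm(prompt)
    except Exception as exc:  # network, auth, or parse errors from the provider
        print(f"grok provider failed, skipping item: {exc}")
        return None
```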
Notes
Please follow the existing provider implementation style instead of introducing a one-off path. If someone in the community wants to pick this up, comment on the issue and go for it.