
Releases: ben-vargas/ai-sdk-provider-opencode-sdk

v3.0.1

24 Mar 08:23
c5ffd41


Added

  • User message ID passthrough - Added support for providerOptions.opencode.messageID to control the user message ID sent to OpenCode. Must start with "msg_". (PR #14 by @abhijit-hota)
  • Exported OpencodeProviderOptions type - New type in src/types.ts documenting the per-request provider options surface.

Fixed

  • Session creation error messages - The "Failed to create session" error now includes the server error payload for easier debugging.

v3.0.0

18 Mar 05:03
abdff6a


Breaking change

Model IDs containing multiple / separators now match OpenCode upstream parsing.

Previous behavior:

  • litellm/anthropic/claude-sonnet-4-6 was parsed as providerID = "litellm/anthropic" and modelID = "claude-sonnet-4-6"

New behavior in 3.0.0:

  • litellm/anthropic/claude-sonnet-4-6 is parsed as providerID = "litellm" and modelID = "anthropic/claude-sonnet-4-6"

This aligns the provider with OpenCode itself and fixes provider lookup failures for integrations where the model portion may contain slashes, including LiteLLM-style routes.

This is published as a major release so consumers pinned to ^2.x do not receive this behavioral change automatically.
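The new rule amounts to splitting on the first / only. A minimal sketch of the behavior (not the provider's actual implementation):

```typescript
// Split a model ID on the FIRST "/" only: everything before it is the
// providerID; everything after it, slashes included, is the modelID.
function parseModelID(id: string): { providerID: string; modelID: string } {
  const slash = id.indexOf('/');
  if (slash === -1) {
    throw new Error(`Model ID "${id}" must contain a "/" separator`);
  }
  return {
    providerID: id.slice(0, slash),
    modelID: id.slice(slash + 1),
  };
}

// LiteLLM-style route: the model portion keeps its internal slash.
console.log(parseModelID('litellm/anthropic/claude-sonnet-4-6'));
// { providerID: 'litellm', modelID: 'anthropic/claude-sonnet-4-6' }
```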

Included in this release

  • Align multi-slash model ID parsing with OpenCode upstream
  • Add regression coverage for multi-slash model IDs, including a LiteLLM-style case
  • Update version compatibility documentation for the new 3.x line

v2.1.2

02 Mar 21:39
8986b60


Fixed

  • Streaming delta handling — Added support for message.part.delta events to enable true incremental text and reasoning streaming instead of batch delivery via message.part.updated only. (PR #9 by @abhijit-hota)
  • User-message filtering for deltas — Applied user-role guard to handlePartDelta to prevent user prompt text from leaking into assistant stream output, matching existing filtering in handlePartUpdated.

Changed

  • Dependencies — Bumped @opencode-ai/sdk from ^1.1.65 to ^1.2.15.

Full Changelog: v2.1.1...v2.1.2

v2.1.1

19 Feb 23:14
4ccae57


Added

  • Isolated client manager instances - Added OpencodeClientManager.createInstance() for creating standalone (non-singleton) client managers, enabling concurrent sessions pointing at different servers.
  • Client manager injection - Added clientManager option on OpencodeProviderSettings to use a custom client manager instead of the shared singleton.
  • Validation for conflicting options - Added warning when both clientManager and client are provided.

Changed

  • OpencodeClient type - Aligned OpencodeClient type alias to the SDK-exported OpencodeClient type directly instead of inferring from createOpencodeClient return type.

v2.1.0

18 Feb 23:54
000334a


What's New

Added

  • SDK client passthrough options - clientOptions on OpencodeProviderSettings to forward createOpencodeClient() configuration (headers, auth, fetch, serializers, validators, transformers, throwOnError, and RequestInit-compatible options)
  • Preconfigured client support - client on OpencodeProviderSettings to use a prebuilt OpenCode SDK client directly
  • New runnable example - examples/client-options.ts demonstrating both patterns

Changed

  • Client initialization behavior - clientOptions are now applied consistently across external baseUrl, existing-server, and auto-started-server client creation paths
  • Conflict handling - Reserved baseUrl and directory values in clientOptions are ignored with warnings; client takes precedence over clientOptions
  • Singleton safety - OpencodeClientManager now warns when options passed after initialization are dropped, instead of ignoring them silently

Fixed

  • Duplicate provider warnings — Validation warnings are no longer logged twice during provider creation

Full Changelog: v2.0.0...v2.1.0

v2.0.0 — OpenCode SDK v2 Migration

13 Feb 20:46
64deb75


What's Changed

Migrates the provider from OpenCode SDK v1 to v2 (@opencode-ai/sdk/v2), adding support for new v2 capabilities and hardening the implementation.

Core Migration

  • SDK v2 API style: All SDK calls migrated from nested { path, body } to v2 flat parameter style ({ sessionID, directory, ... })
  • Import path: @opencode-ai/sdk → @opencode-ai/sdk/v2
  • Dependency bumps: @ai-sdk/provider ^3.0.8, @ai-sdk/provider-utils ^4.0.15, ai ^6.0.85

New Features

  • Permission/tool approval flow: Emits tool-approval-request stream parts from permission.asked events; handles tool-approval-response parts via permission.reply() API
  • Native structured output: JSON mode uses OpenCode's native json_schema format with retryCount instead of prompt-based instructions
  • File/source content streaming: handleFilePart() fully implemented — handles data URLs, HTTP URLs, and source metadata
  • New settings: permission (ruleset), variant, directory (per-request routing), outputFormatRetryCount
  • New error types: ContextOverflowError → "length", StructuredOutputError → "error"

Hardening

  • Question events: question.asked emits a stream error part with warning; related events are known no-ops
  • Known v2 events/parts: All new v2 event and part types explicitly handled (no "Unknown" debug spam)
  • Safe tool input serialization: safeStringifyToolInput() with try/catch fallback
  • Typed approval client: ApprovalClient interface replaces loose typing
  • File part diagnostics: Debug logs when file parts are skipped
  • Non-prefix tool input delta: Prevents data loss on non-incremental input changes

Deprecations

  • tools setting → use permission ruleset
  • cwd setting → use directory

Tests

  • 283 tests passing (up from 278)

Full Changelog: v1.0.0...v2.0.0

AI SDK v6 Compatibility (v1.0.0)

02 Jan 00:56
ad6e2bb


Overview

This is the first AI SDK v6–compatible release of ai-sdk-provider-opencode-sdk.

What’s new

  • Provider updated to AI SDK v6 interfaces (LanguageModelV3/ProviderV3).
  • Examples updated to v6 APIs and Output.object.
  • README now documents v5/v6 compatibility and npm tags.

Versioning & tags

  • latest → 1.x (AI SDK v6)
  • ai-sdk-v5 → 0.x (AI SDK v5)

v0.0.2 - Dependency Updates

11 Dec 00:44


Changes

Updated Dependencies

  • @ai-sdk/provider-utils: 3.0.9 → 3.0.18
  • @opencode-ai/sdk: ^1.0.141 → ^1.0.137 (aligned with stable release)

Fixed

  • OpenAI model names - Documentation now correctly references GPT-5.1 series models:
    • openai/gpt-5.1
    • openai/gpt-5.1-codex
    • openai/gpt-5.1-codex-mini
    • openai/gpt-5.1-codex-max

Installation

npm install ai-sdk-provider-opencode-sdk@0.0.2

See CHANGELOG.md for full details.

v0.0.1 - Initial Release

11 Dec 00:31


AI SDK Provider for OpenCode - Initial Release

This is the initial release of the AI SDK v5 provider for OpenCode via the @opencode-ai/sdk.

Features

Core Functionality

  • Text generation - generateText() and streamText() with real-time SSE streaming
  • Object generation - generateObject() and streamObject() with Zod schema validation
  • Multi-turn conversations - Session-based context management
  • Tool observation - Watch server-side tool execution (Read, Write, Bash, etc.)

Multi-Provider Support

  • Anthropic - Claude 4.5 series (opus, sonnet, haiku)
  • OpenAI - GPT-5.1 series (gpt-5.1, gpt-5.1-codex, gpt-5.1-codex-mini, gpt-5.1-codex-max)
  • Google - Gemini 2.0/2.5/3.0 series

Provider Features

  • Auto-start server management
  • Agent selection (build, plan, general, explore)
  • Custom system prompts
  • Session resume/management
  • AbortController support for cancellation
  • Base64 image input for vision models
  • Custom logging with verbose mode

Installation

npm install ai-sdk-provider-opencode-sdk ai @opencode-ai/sdk

Quick Start

import { generateText } from 'ai';
import { opencode } from 'ai-sdk-provider-opencode-sdk';

const result = await generateText({
  model: opencode('anthropic/claude-opus-4-5-20251101'),
  prompt: 'What is the capital of France?',
});

console.log(result.text);

Examples

The package includes comprehensive examples:

  • basic-usage.ts - Simple text generation
  • streaming.ts - Real-time streaming
  • generate-object.ts / stream-object.ts - Structured output
  • tool-observation.ts - Watching tool execution
  • image-input.ts - Vision model usage
  • abort-signal.ts - Request cancellation
  • custom-config.ts - Provider configuration
  • limitations.ts - What's NOT supported

Known Limitations

The following AI SDK parameters are ignored (OpenCode uses provider defaults):

  • temperature, topP, topK
  • maxOutputTokens
  • presencePenalty, frequencyPenalty
  • stopSequences, seed

Custom tool definitions are ignored - OpenCode executes tools server-side.

See CHANGELOG.md for full details.