Conversation

@airtonix
Contributor

Summary

Implements model-aware prompt format selection that tries provider-specific model configurations before falling back to generic model formats, enabling flexible multi-model setups.

Changes

  • MessageModelIdAccountant service: Tracks the active LLM model per message/session by listening to chat.message events and cleaning up on message/session deletion
  • Hierarchical model format matching: Updated getModelFormat() to try {providerID}-{modelID} format first (e.g., "anthropic-claude-3-5-sonnet"), then fall back to {modelID} (e.g., "claude-3-5-sonnet")
  • Functional renderer pattern: Refactored all renderers from class-based to functional factory pattern, returning immutable render functions
  • Enhanced prompt renderer factory: createPromptRenderer() now manages renderer instances and returns a getFormatter() function for dynamic format selection (see the sketch after this list)
  • Comprehensive README documentation:
    • Comparison table of XML, JSON, and Markdown formats with use cases
    • Detailed explanation of hierarchical model matching with examples
    • Example configurations for multi-model setups
    • Project-specific configuration overrides
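
A minimal sketch of the factory shape described above; the type names, the getFormatter() signature, and the placeholder renderer bodies are assumptions, not the plugin's actual API:

```typescript
// Illustrative sketch only: names and signatures are assumed, not copied from the plugin.
type RenderFn = (data: unknown, rootElement?: string) => string;
type PromptFormat = "xml" | "json" | "md";

// Hypothetical factory: build every renderer once, hand back a selector for later lookups.
function createPromptRenderer(
  renderers: Record<PromptFormat, RenderFn>,
  defaultFormat: PromptFormat = "xml",
) {
  return {
    // Pick the render function for a requested format, falling back to the default.
    getFormatter(format?: PromptFormat): RenderFn {
      return (format && renderers[format]) || renderers[defaultFormat];
    },
  };
}

// The returned render functions are plain closures, so they stay immutable.
const prompt = createPromptRenderer({
  xml: (_data, root = "prompt") => `<${root}>…</${root}>`, // placeholder body
  json: (data, root = "prompt") => JSON.stringify({ [root]: data }, null, 2),
  md: (_data) => "# prompt\n…", // placeholder body
});
console.log(prompt.getFormatter("json")({ name: "example" }, "skill"));
```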

Testing

  • ✅ All 185 existing unit tests passing
  • ✅ New renderer tests added
  • ✅ MessageModelIdAccountant integrated and tested
  • ✅ ESLint and Prettier passing
  • ✅ TypeScript strict mode validation passing

Implementation Details

When a tool executes (skill_find, skill_use, skill_resource):

  1. MessageModelIdAccountant provides the active model's providerID and modelID
  2. getModelFormat() checks for a match in this priority order (sketched after this list):
    • {providerID}-{modelID} (specific: e.g., "anthropic-claude-3-5-sonnet")
    • {modelID} (generic: e.g., "claude-3-5-sonnet")
    • Falls back to global promptRenderer default
  3. createPromptRenderer() returns the appropriate formatter function
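
A minimal sketch of the lookup in step 2, assuming a flat modelRenderers map and these parameter names (both are illustrative, not the plugin's exact API):

```typescript
// Illustrative only: the config shape and parameter order are assumptions.
type PromptFormat = "xml" | "json" | "md";
type ModelRenderers = Record<string, PromptFormat>; // e.g. { "anthropic-claude-3-5-sonnet": "xml" }

function getModelFormat(
  modelRenderers: ModelRenderers,
  providerID: string,
  modelID: string,
  fallback: PromptFormat = "xml",
): PromptFormat {
  return (
    modelRenderers[`${providerID}-${modelID}`] ?? // 1. provider-specific entry
    modelRenderers[modelID] ??                    // 2. generic model entry
    fallback                                      // 3. global promptRenderer default
  );
}
```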

This allows users to configure once at the generic model level (e.g., "gpt-4": "json") and have it work for all provider variants.
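
For instance, a configuration along these lines (field names follow the description in this PR; the exact schema is an assumption) keeps XML as the global default, gives every gpt-4 variant JSON, and pins the Anthropic-hosted Claude model to XML:

```typescript
// Hypothetical .opencode-skillful.json content expressed as a typed object; keys are assumptions.
export const config = {
  promptRenderer: "xml", // global default
  modelRenderers: {
    "gpt-4": "json",                      // generic: matches gpt-4 from any provider
    "anthropic-claude-3-5-sonnet": "xml", // provider-specific: checked before any generic entry
    "claude-3-5-sonnet": "md",            // generic fallback for other providers of the same model
  },
} as const;
```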

… format selection

Implement a provider pattern for prompt rendering that allows selecting between
JSON, XML (default), and Markdown formats. The system can be configured globally
or on a per-model basis via the bunfig configuration.

Key features:
- PromptRenderer interface with three implementations (JSON, XML, Markdown); see the sketch after this list
- Markdown renderer with nested heading/list structure and HTML escaping
- Model-aware format selection via getModelFormat() helper
- Dynamic renderer creation at tool execution time
- Backward compatible: defaults to XML format (current behavior)
- Configuration via bunfig with support for .opencode-skillful.json
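
A rough sketch of the interface from the first bullet together with a toy Markdown renderer (nested objects become deeper headings, arrays become list items, values are HTML-escaped); the real signatures and output layout may differ:

```typescript
// Assumed shapes only; the plugin's real interface and Markdown output may differ.
export interface PromptRenderer {
  /** Serialize arbitrary skill data into the renderer's target prompt format. */
  render(data: unknown, rootElement?: string): string;
}

// Hypothetical Markdown renderer, for illustration only.
const escapeHtml = (s: string) =>
  s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");

export const mdPromptRenderer: PromptRenderer = {
  render(data, rootElement = "prompt") {
    const walk = (value: unknown, depth: number): string => {
      if (Array.isArray(value)) {
        return value.map((item) => `- ${walk(item, depth)}`).join("\n");
      }
      if (value !== null && typeof value === "object") {
        return Object.entries(value as Record<string, unknown>)
          .map(([key, child]) => `${"#".repeat(depth)} ${key}\n${walk(child, depth + 1)}`)
          .join("\n");
      }
      return escapeHtml(String(value));
    };
    return `# ${rootElement}\n${walk(data, 2)}`;
  },
};
```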

Changes:
- Add bunfig dependency for typed config loading
- Create src/lib/PromptRenderer.ts interface and factory
- Create three renderer implementations for JSON, XML, and Markdown formats
- Create getModelFormat() helper for model-aware format selection
- Update src/types.ts to include promptRenderer config
- Update src/config.ts to load renderer config from bunfig
- Update src/api.ts to expose config in API object
- Update src/index.ts to use renderer.render() instead of jsonToXml()
- Update three tool injection points to use the model-aware renderer (sketched below)
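
A hedged sketch of one injection point after this change, reusing the lookup and formatter ideas sketched earlier; every name and shape below is an assumption used only to show how the pieces fit together:

```typescript
// Illustrative wiring only: none of these names are the plugin's real API.
type PromptFormat = "xml" | "json" | "md";
type RenderFn = (data: unknown, rootElement?: string) => string;

const config = {
  promptRenderer: "xml" as PromptFormat,
  modelRenderers: { "gpt-4": "json" } as Record<string, PromptFormat>,
};

// Stand-ins for the helpers described in this PR.
const accountant = {
  getModel: (_sessionID: string, _messageID: string) => ({ providerID: "openai", modelID: "gpt-4" }),
};
const getModelFormat = (providerID: string, modelID: string): PromptFormat =>
  config.modelRenderers[`${providerID}-${modelID}`] ?? config.modelRenderers[modelID] ?? config.promptRenderer;
const getFormatter = (format: PromptFormat): RenderFn =>
  format === "json"
    ? (data, root = "prompt") => JSON.stringify({ [root]: data }, null, 2)
    : (data, root = "prompt") => `<${root}>${JSON.stringify(data)}</${root}>`; // md stub omitted

// One tool injection point: previously this always produced XML via jsonToXml(skillData).
function injectSkillPrompt(sessionID: string, messageID: string, skillData: unknown): string {
  const { providerID, modelID } = accountant.getModel(sessionID, messageID);
  return getFormatter(getModelFormat(providerID, modelID))(skillData, "skills");
}

console.log(injectSkillPrompt("s1", "m1", { name: "example" })); // JSON, because "gpt-4" maps to "json"
```
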
Add comprehensive test coverage for all renderer implementations:
- JsonPromptRenderer: JSON output, indentation, nested structures
- XmlPromptRenderer: XML formatting, special character escaping, root elements
- MdPromptRenderer: Markdown headings, list items, nesting, HTML escaping
- Factory function: renderer creation, format validation, error handling
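
As an illustration of what these suites cover, a minimal bun:test case assuming a factory-style JSON renderer (the import path and factory name are hypothetical):

```typescript
import { describe, expect, it } from "bun:test";

// Hypothetical import: the actual module path and factory name may differ.
import { createJsonPromptRenderer } from "../JsonPromptRenderer";

describe("JsonPromptRenderer", () => {
  it("wraps the payload in the requested root element", () => {
    const render = createJsonPromptRenderer();
    const output = render({ name: "example" }, "skill");
    expect(JSON.parse(output)).toEqual({ skill: { name: "example" } });
  });
});
```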

Expand README with new Prompt Renderer Configuration section:
- Document all three supported formats (XML, JSON, Markdown)
- Provide configuration examples for .opencode-skillful.json
- Explain model-aware format selection with real examples
- Show output examples for each format

Changes:
- Create src/lib/__tests__/createPromptRenderer.test.ts
- Create src/lib/renderers/__tests__/JsonPromptRenderer.test.ts
- Create src/lib/renderers/__tests__/XmlPromptRenderer.test.ts
- Create src/lib/renderers/__tests__/MdPromptRenderer.test.ts
- Update README.md with Prompt Renderer Configuration section

Test Results:
- 185 tests pass (up from 147)
- Full coverage of renderer implementations
- All linting and build checks pass
… fallback pattern

Implements model-aware format selection that tries the provider-specific format (e.g., "anthropic-claude-3-5-sonnet") before falling back to the generic model format (e.g., "claude-3-5-sonnet"). Includes:

- MessageModelIdAccountant service to track the active model per message/session (see the sketch after this list)
- Improved getModelFormat() to handle provider + modelId parameters
- Refactored renderers from class-based to functional pattern
- Enhanced README with detailed configuration examples for multi-model setups
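
A rough sketch of the accountant from the first bullet, assuming an in-memory map keyed by session and message IDs; the event wiring and field names are guesses based on the description above:

```typescript
// Illustrative in-memory tracker; the real event payloads and cleanup hooks may differ.
type ModelRef = { providerID: string; modelID: string };

export function createMessageModelIdAccountant() {
  const byMessage = new Map<string, ModelRef>();    // `${sessionID}:${messageID}` -> model
  const bySession = new Map<string, Set<string>>(); // sessionID -> message keys, for bulk cleanup

  return {
    // Called from a chat.message event handler.
    record(sessionID: string, messageID: string, model: ModelRef) {
      const key = `${sessionID}:${messageID}`;
      byMessage.set(key, model);
      const keys = bySession.get(sessionID) ?? new Set<string>();
      keys.add(key);
      bySession.set(sessionID, keys);
    },
    get(sessionID: string, messageID: string): ModelRef | undefined {
      return byMessage.get(`${sessionID}:${messageID}`);
    },
    // Cleanup on message deletion.
    forgetMessage(sessionID: string, messageID: string) {
      const key = `${sessionID}:${messageID}`;
      byMessage.delete(key);
      bySession.get(sessionID)?.delete(key);
    },
    // Cleanup on session deletion.
    forgetSession(sessionID: string) {
      for (const key of bySession.get(sessionID) ?? []) byMessage.delete(key);
      bySession.delete(sessionID);
    },
  };
}
```
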
Copilot AI review requested due to automatic review settings December 29, 2025 11:44

Copilot AI left a comment

Pull request overview

This PR adds configurable prompt rendering with model-aware format selection, allowing different LLM models to receive skill data in their preferred formats (XML, JSON, or Markdown). The implementation uses a hierarchical fallback pattern that tries provider-specific model configurations before falling back to generic model formats.

Key Changes:

  • Added MessageModelIdAccountant service to track active models per message/session
  • Implemented three renderer types (XML, JSON, Markdown) with functional factory pattern
  • Added hierarchical model format matching with provider-specific and generic fallback
  • Integrated bunfig for configuration management with support for per-model renderer overrides

Reviewed changes

Copilot reviewed 21 out of 22 changed files in this pull request and generated 18 comments.

Summary per file:

| File | Description |
| --- | --- |
| src/types.ts | Added promptRenderer and modelRenderers configuration fields to PluginConfig type |
| src/services/MessageModelIdAccountant.ts | New service to track model usage per message/session with cleanup handlers |
| src/lib/renderers/XmlPromptRenderer.ts | XML renderer factory function using existing jsonToXml utility |
| src/lib/renderers/XmlPromptRenderer.test.ts | Test suite for XML renderer (has critical bugs: uses class pattern instead of factory) |
| src/lib/renderers/JsonPromptRenderer.ts | JSON renderer factory function with root element wrapping |
| src/lib/renderers/JsonPromptRenderer.test.ts | Test suite for JSON renderer (has critical bugs: uses class pattern instead of factory) |
| src/lib/renderers/MdPromptRenderer.ts | Markdown renderer factory with heading/list formatting (has bugs in null handling and newlines) |
| src/lib/renderers/MdPromptRenderer.test.ts | Test suite for Markdown renderer (has critical bugs: uses class pattern instead of factory) |
| src/lib/getModelFormat.ts | Hierarchical model format resolver with provider-specific and generic fallback |
| src/lib/createPromptRenderer.ts | Factory that creates all renderers and returns formatter selection function |
| src/lib/createPromptRenderer.test.ts | Test suite for renderer factory (incompatible with actual implementation) |
| src/lib/PromptRenderer.ts | Interface definition for renderer implementations |
| src/index.ts | Integration of model tracking and renderer selection in tool execution handlers |
| src/config.ts | Updated to use bunfig for configuration loading with modelRenderers support |
| src/api.ts | Minor documentation updates for renderer support |
| src/mocks.ts | Updated mock config with new promptRenderer and modelRenderers fields |
| bunfig.config.ts | New bunfig configuration schema (missing basePaths field) |
| package.json | Added bunfig dependency |
| bun.lock | Lock file updates for bunfig and dependencies |
| README.md | Extensive documentation for prompt renderer configuration (examples don't match implementation) |
| .mise/tasks/setup | Added git worktree check to skip hk install in worktrees |

Copilot AI review requested due to automatic review settings December 29, 2025 23:18

Copilot AI left a comment

Pull request overview

Copilot reviewed 23 out of 24 changed files in this pull request and generated 6 comments.

Copilot AI review requested due to automatic review settings December 30, 2025 00:02

Copilot AI left a comment

Pull request overview

Copilot reviewed 24 out of 25 changed files in this pull request and generated 6 comments.

@airtonix airtonix merged commit 0067a12 into main Dec 30, 2025
3 checks passed
@airtonix airtonix deleted the feat/prompt-renderer-config branch December 30, 2025 01:43