
[codex] Add MCP access, Chinese-first output mode, and dedicated ingest model #152

Open
SipengXie2024 wants to merge 10 commits into nashsu:main from SipengXie2024:codex/mcp-output-doc-models

Conversation

@SipengXie2024

Summary

This PR bundles a set of local commits that improve local agent workflows and day-to-day authoring in LLM Wiki.

  • add LLM Wiki MCP access and local agent settings
  • refactor chat retrieval into a shared helper and fix the sources panel layout
  • add a Chinese-first output mode that preserves necessary English terms
  • add a dedicated document-processing model config for ingest, merge, and Save to Wiki flows
  • ignore local Windows dev helper files

What changed

Local agent / MCP access

  • add the local MCP server entrypoint, bridge plumbing, tests, and settings UI
  • expose local agent access controls in Settings
  • keep the bridge gated so local agent access can be explicitly enabled
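To make the "gated bridge" idea concrete, here is a minimal sketch of how an explicit-enable check could look. All names (`LocalAgentSettings`, `canUseTool`, `allowedTools`) are illustrative assumptions, not the PR's actual code:

```typescript
// Hypothetical sketch: local agent (MCP) access is denied unless the user
// has explicitly enabled it in Settings; an optional allow-list can further
// restrict which tools the bridge exposes.
interface LocalAgentSettings {
  mcpEnabled: boolean;      // off by default; toggled in the Settings UI
  allowedTools?: string[];  // optional allow-list of exposed MCP tools
}

function canUseTool(settings: LocalAgentSettings, tool: string): boolean {
  if (!settings.mcpEnabled) return false;      // bridge is gated off
  if (!settings.allowedTools) return true;     // enabled, no allow-list
  return settings.allowedTools.includes(tool); // enabled and allow-listed
}

// Defaults keep the bridge disabled, matching "explicitly enabled" above.
const defaultSettings: LocalAgentSettings = { mcpEnabled: false };
```

The key design point is that the default settings object denies everything, so enabling local agent access is always an explicit opt-in.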

Chat and output behavior

  • move chat retrieval prompt construction into a shared helper
  • add a Chinese-first output mode for Chinese academic workflows that preserves English technical terms, titles, commands, code, and paths
  • update output-mode copy and project-creation defaults accordingly
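As a sketch of how a Chinese-first mode might be expressed as a system-prompt fragment, assuming a simple mode switch (the type and function names here are hypothetical, not the PR's implementation):

```typescript
// Hypothetical sketch of the Chinese-first output mode as a prompt fragment.
type OutputMode = "auto" | "chinese-first";

function outputModeInstruction(mode: OutputMode): string {
  if (mode !== "chinese-first") return ""; // other modes add no instruction
  // Respond in Chinese, but keep the categories the PR lists in English.
  return [
    "Respond in Chinese.",
    "Keep English technical terms, paper titles, commands, code, and file paths in their original English form.",
  ].join(" ");
}
```

The empty-string return for other modes keeps the fragment composable: it can be unconditionally appended to a base system prompt without affecting non-Chinese workflows.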

Dedicated document-processing model

  • add a separate document-processing model config for ingest-related LLM work
  • keep normal chat, deep-research synthesis, lint, dedup, and review on the main model
  • route ingest queue, source import, clip ingest, Save to Wiki, page merge, and post-research auto-ingest through the document-processing model path
  • add readiness checks and tests for the dedicated ingest model flow
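The routing rule above can be sketched as a small selector. This is an illustrative model (the `ModelConfig` shape, task names, and `modelForTask` helper are assumptions for this sketch), showing the fallback behavior described in the Notes: without a dedicated config, document processing reuses the main model:

```typescript
// Hypothetical sketch of per-task model routing with fallback to the main model.
type Task =
  | "chat" | "deep-research" | "lint" | "dedup" | "review"   // main model
  | "ingest" | "source-import" | "clip-ingest"
  | "save-to-wiki" | "page-merge" | "auto-ingest";           // doc-processing model

interface ModelConfig {
  main: string;                 // model used for chat and review-style tasks
  documentProcessing?: string;  // optional dedicated ingest model
}

const DOC_TASKS = new Set<Task>([
  "ingest", "source-import", "clip-ingest",
  "save-to-wiki", "page-merge", "auto-ingest",
]);

function modelForTask(cfg: ModelConfig, task: Task): string {
  // Fallback preserves existing behavior: if no dedicated model is
  // configured, document processing still reuses the main LLM.
  return DOC_TASKS.has(task) ? (cfg.documentProcessing ?? cfg.main) : cfg.main;
}
```

Keeping the fallback in one place also makes the readiness check cheap: a config is "usable" for ingest as long as `modelForTask` resolves to some configured model.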

Local repo hygiene

  • ignore local Windows shortcut / launcher / tools artifacts used for local development

Validation

  • npm run typecheck
  • npx vitest run src/lib/document-llm.test.ts src/lib/has-usable-llm.test.ts src/lib/output-language.test.ts src/lib/language-metadata.test.ts

Notes

  • existing users keep their current behavior by default: document processing still reuses the main LLM until a dedicated model is explicitly configured in Settings
  • this PR intentionally does not split research/lint/dedup into separate model configs

@SipengXie2024 SipengXie2024 marked this pull request as ready for review May 11, 2026 10:29
…models

Conflicts resolved during merge:

  • src-tauri/src/lib.rs
  • src/App.tsx
  • src/components/settings/settings-view.tsx
  • src/components/sources/sources-view.tsx
