
Conversation

@mcandeia (Contributor) commented Jan 9, 2026

Summary by cubic

Adds the Deco LLM app and binding powered by OpenRouter, with usage-aware billing via Deco Wallet (pre-authorization and commit). Enables metered LLM streaming and generation with centralized registry types.

  • New Features

    • New deco-llm MCP app and server with LLM binding (list, get, metadata, stream, generate).
    • Usage hooks integrate with the Wallet: they pre-authorize an amount based on model pricing and commit the actual cost in microdollars (a sketch of this flow follows the migration notes below).
    • Shared registry exports consolidated and extended with Wallet contract definitions.
    • OpenRouter improvements: env fallback for OPENROUTER_API_KEY, new exports (tools, types, hooks), and tools now accept env and usage hooks.
  • Migration

    • Provide OPENROUTER_API_KEY or send Authorization in requests.
    • Grant WALLET scopes: PRE_AUTHORIZE_AMOUNT and COMMIT_PRE_AUTHORIZED_AMOUNT.
    • Update imports to @decocms/openrouter and shared/registry.
    • Add deco-llm to the workspace and publish the app if you use the registry.
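
A minimal sketch of the pre-authorize/commit flow in TypeScript. Only the Wallet tool names (PRE_AUTHORIZE_AMOUNT, COMMIT_PRE_AUTHORIZED_AMOUNT), the microdollar unit, and the env.MESH_REQUEST_CONTEXT.state.WALLET binding appear in this PR; the hook shape and argument fields below are assumptions for illustration.

// Hypothetical Env and Wallet shapes; only the binding path is taken from the PR.
type Wallet = {
  PRE_AUTHORIZE_AMOUNT(input: { amount: number }): Promise<{ id: string }>;
  COMMIT_PRE_AUTHORIZED_AMOUNT(input: { id: string; amount: number }): Promise<void>;
};
type Env = { MESH_REQUEST_CONTEXT: { state: { WALLET: Wallet } } };

// Assumed hook shape; amounts are in microdollars.
export const createUsageHooks = (env: Env) => ({
  // Before the model call: reserve an estimate derived from the model's pricing.
  start: async (estimatedMicrodollars: number) => {
    const preAuth = await env.MESH_REQUEST_CONTEXT.state.WALLET.PRE_AUTHORIZE_AMOUNT({
      amount: estimatedMicrodollars,
    });
    return preAuth.id;
  },
  // After the call: commit the cost actually reported for the request.
  end: async (preAuthId: string, actualMicrodollars: number) => {
    await env.MESH_REQUEST_CONTEXT.state.WALLET.COMMIT_PRE_AUTHORIZED_AMOUNT({
      id: preAuthId,
      amount: actualMicrodollars,
    });
  },
});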

Written for commit 336454a. Summary will update on new commits.

Signed-off-by: Marcos Candeia <[email protected]>
@mcandeia mcandeia merged commit 5b35514 into main Jan 9, 2026
13 of 19 checks passed
github-actions bot commented Jan 9, 2026

🚀 Preview Deployments Ready!

Your changes have been deployed to preview environments:

📦 content-scraper · 🔗 View Preview
📦 deco-llm · 🔗 View Preview
📦 google-calendar · 🔗 View Preview
📦 google-tag-manager · 🔗 View Preview
📦 mcp-studio · 🔗 View Preview
📦 meta-ads · 🔗 View Preview
📦 openrouter · 🔗 View Preview
📦 registry · 🔗 View Preview

These previews will be automatically updated with new commits to this PR.


Deployed from commit: b9f3835

@cubic-dev-ai cubic-dev-ai bot left a comment

7 issues found across 24 files

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them.


<file name="deco-llm/server/main.ts">

<violation number="1" location="deco-llm/server/main.ts:77">
P2: `process.env` is not available in Cloudflare Workers environment. The `WALLET_VENDOR_ID` environment variable will always be undefined, causing the code to always fall back to `"deco"`. Consider accessing environment variables through the Deco runtime's `env` parameter instead.</violation>
</file>

<file name="shared/registry.ts">

<violation number="1" location="shared/registry.ts:41">
P1: `z.object({})` only validates empty objects and strips unknown properties. If arbitrary objects are expected (as the TypeScript types suggest), use `z.record(z.unknown())` or `z.object({}).passthrough()` to preserve the object contents.</violation>

<violation number="2" location="shared/registry.ts:47">
P1: `z.object({})` will strip all properties from the input schema. Use `z.record(z.unknown())` to preserve arbitrary object contents.</violation>

<violation number="3" location="shared/registry.ts:48">
P1: `z.object({})` will strip all properties from the output schema. Use `z.record(z.unknown())` to preserve arbitrary object contents.</violation>
</file>

<file name="deco-llm/server/usage.ts">

<violation number="1" location="deco-llm/server/usage.ts:25">
P1: Unit mismatch: `maxContextLength` is calculated as character count (via `JSON.stringify().length`), but it's multiplied by a per-token price. Characters and tokens are different units. Consider using a tokenizer to estimate token count, or apply a character-to-token ratio approximation (e.g., divide by 4 for rough estimate).</violation>
</file>

<file name="openrouter/server/lib/env.ts">

<violation number="1" location="openrouter/server/lib/env.ts:5">
P2: Error message is misleading after adding env var fallback. Users might not realize they can set `OPENROUTER_API_KEY` as an alternative to providing an authorization header.</violation>
</file>

<file name="openrouter/server/tools/llm-binding.ts">

<violation number="1" location="openrouter/server/tools/llm-binding.ts:508">
P1: Floating promise without error handling. If the `usage` promise rejects (e.g., stream cancelled) or `hook?.end?()` throws, this will cause an unhandled promise rejection. Consider adding a `.catch()` handler.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

name: z.string(),
description: z.string(),
inputSchema: z.object({}),
outputSchema: z.object({}),
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P1: z.object({}) will strip all properties from the output schema. Use z.record(z.unknown()) to preserve arbitrary object contents.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At shared/registry.ts, line 48:

<comment>`z.object({})` will strip all properties from the output schema. Use `z.record(z.unknown())` to preserve arbitrary object contents.</comment>

<file context>
@@ -0,0 +1,99 @@
+      name: z.string(),
+      description: z.string(),
+      inputSchema: z.object({}),
+      outputSchema: z.object({}),
+    }),
+  ),
</file context>
Suggested change
outputSchema: z.object({}),
outputSchema: z.record(z.unknown()),
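
For context, a minimal sketch of the difference, assuming zod v3's default strip behavior (the schema payload below is illustrative):

import { z } from "zod";

const Stripped = z.object({});           // accepts any object but drops every key
const Preserved = z.record(z.unknown()); // keeps arbitrary keys and values

const inputSchema = { type: "object", properties: { q: { type: "string" } } };

Stripped.parse(inputSchema);  // => {}  (the schema contents are silently lost)
Preserved.parse(inputSchema); // => the original object, all properties intact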

z.object({
name: z.string(),
description: z.string(),
inputSchema: z.object({}),
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P1: z.object({}) will strip all properties from the input schema. Use z.record(z.unknown()) to preserve arbitrary object contents.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At shared/registry.ts, line 47:

<comment>`z.object({})` will strip all properties from the input schema. Use `z.record(z.unknown())` to preserve arbitrary object contents.</comment>

<file context>
@@ -0,0 +1,99 @@
+    z.object({
+      name: z.string(),
+      description: z.string(),
+      inputSchema: z.object({}),
+      outputSchema: z.object({}),
+    }),
</file context>
Suggested change
inputSchema: z.object({}),
inputSchema: z.record(z.unknown()),

}>;
};
const ConnectionSchema = z.object({
configuration_state: z.object({}),
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P1: z.object({}) only validates empty objects and strips unknown properties. If arbitrary objects are expected (as the TypeScript types suggest), use z.record(z.unknown()) or z.object({}).passthrough() to preserve the object contents.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At shared/registry.ts, line 41:

<comment>`z.object({})` only validates empty objects and strips unknown properties. If arbitrary objects are expected (as the TypeScript types suggest), use `z.record(z.unknown())` or `z.object({}).passthrough()` to preserve the object contents.</comment>

<file context>
@@ -0,0 +1,99 @@
+  }>;
+};
+const ConnectionSchema = z.object({
+  configuration_state: z.object({}),
+  configuration_scopes: z.array(z.string()),
+  tools: z.array(
</file context>
Suggested change
configuration_state: z.object({}),
configuration_state: z.record(z.unknown()),

params: z.infer<typeof LanguageModelInputSchema>,
): string => {
const maxContextLength = Math.min(
JSON.stringify({
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P1: Unit mismatch: maxContextLength is calculated as character count (via JSON.stringify().length), but it's multiplied by a per-token price. Characters and tokens are different units. Consider using a tokenizer to estimate token count, or apply a character-to-token ratio approximation (e.g., divide by 4 for rough estimate).

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At deco-llm/server/usage.ts, line 25:

<comment>Unit mismatch: `maxContextLength` is calculated as character count (via `JSON.stringify().length`), but it's multiplied by a per-token price. Characters and tokens are different units. Consider using a tokenizer to estimate token count, or apply a character-to-token ratio approximation (e.g., divide by 4 for rough estimate).</comment>

<file context>
@@ -0,0 +1,44 @@
+  params: z.infer<typeof LanguageModelInputSchema>,
+): string => {
+  const maxContextLength = Math.min(
+    JSON.stringify({
+      ...params.callOptions.prompt,
+      ...params.callOptions.tools,
</file context>
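
One rough way to reconcile the units, sketched below; the 4-characters-per-token ratio and the helper names are assumptions, not code from this PR:

// Approximate token count from the serialized prompt (heuristic: ~4 chars per token).
const approximateTokens = (serialized: string): number =>
  Math.ceil(serialized.length / 4);

// Pre-authorization estimate in microdollars, given a per-token price in microdollars.
const estimatePreAuthMicrodollars = (
  serializedPrompt: string,
  pricePerTokenMicrodollars: number,
): number => approximateTokens(serializedPrompt) * pricePerTokenMicrodollars;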

const [usage, stream] = getUsageFromStream(
callResponse.stream as ReadableStream<LanguageModelV2StreamPart>,
);
usage.then((usage) => hook?.end?.(usage));
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P1: Floating promise without error handling. If the usage promise rejects (e.g., stream cancelled) or hook?.end?() throws, this will cause an unhandled promise rejection. Consider adding a .catch() handler.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At openrouter/server/tools/llm-binding.ts, line 508:

<comment>Floating promise without error handling. If the `usage` promise rejects (e.g., stream cancelled) or `hook?.end?()` throws, this will cause an unhandled promise rejection. Consider adding a `.catch()` handler.</comment>

<file context>
@@ -493,13 +494,18 @@ export const createLLMStreamTool = (env: Env) =>
+        const [usage, stream] = getUsageFromStream(
           callResponse.stream as ReadableStream<LanguageModelV2StreamPart>,
         );
+        usage.then((usage) => hook?.end?.(usage));
         const response = streamToResponse(stream);
 
</file context>
Suggested change
usage.then((usage) => hook?.end?.(usage));
usage.then((usage) => hook?.end?.(usage)).catch(() => {});
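
If silently swallowing the failure is not desirable, a sketch of a logging variant (console.error stands in for whatever logger the server already uses):

usage
  .then((usage) => hook?.end?.(usage))
  .catch((err) => console.error("usage reporting failed:", err));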

if (!isOpenRouterUsageReport(usage)) {
throw new Error("Usage cost not found");
}
const vendorId = process.env.WALLET_VENDOR_ID ?? "deco";
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P2: process.env is not available in Cloudflare Workers environment. The WALLET_VENDOR_ID environment variable will always be undefined, causing the code to always fall back to "deco". Consider accessing environment variables through the Deco runtime's env parameter instead.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At deco-llm/server/main.ts, line 77:

<comment>`process.env` is not available in Cloudflare Workers environment. The `WALLET_VENDOR_ID` environment variable will always be undefined, causing the code to always fall back to `"deco"`. Consider accessing environment variables through the Deco runtime's `env` parameter instead.</comment>

<file context>
@@ -0,0 +1,104 @@
+            if (!isOpenRouterUsageReport(usage)) {
+              throw new Error("Usage cost not found");
+            }
+            const vendorId = process.env.WALLET_VENDOR_ID ?? "deco";
+            await env.MESH_REQUEST_CONTEXT.state.WALLET.COMMIT_PRE_AUTHORIZED_AMOUNT(
+              {
</file context>
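
A sketch of the suggested direction, reading the value from the worker env rather than process.env; the WALLET_VENDOR_ID binding on Env is an assumption, not something this PR defines:

// Hypothetical binding: WALLET_VENDOR_ID declared as a plain var on the worker Env.
interface Env {
  WALLET_VENDOR_ID?: string;
  // ...existing bindings such as MESH_REQUEST_CONTEXT
}

const resolveVendorId = (env: Env): string => env.WALLET_VENDOR_ID ?? "deco";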

export const getOpenRouterApiKey = (env: Env) => {
-  const authorization = env.MESH_REQUEST_CONTEXT.authorization;
+  const authorization =
+    env.MESH_REQUEST_CONTEXT.authorization ?? process.env.OPENROUTER_API_KEY;
@cubic-dev-ai cubic-dev-ai bot Jan 9, 2026

P2: Error message is misleading after adding env var fallback. Users might not realize they can set OPENROUTER_API_KEY as an alternative to providing an authorization header.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At openrouter/server/lib/env.ts, line 5:

<comment>Error message is misleading after adding env var fallback. Users might not realize they can set `OPENROUTER_API_KEY` as an alternative to providing an authorization header.</comment>

<file context>
@@ -1,7 +1,8 @@
 export const getOpenRouterApiKey = (env: Env) => {
-  const authorization = env.MESH_REQUEST_CONTEXT.authorization;
+  const authorization =
+    env.MESH_REQUEST_CONTEXT.authorization ?? process.env.OPENROUTER_API_KEY;
   if (!authorization) {
     throw new Error("Authorization header is required");
</file context>
</file context>
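
A sketch of a clearer failure path; the wording is a suggestion, not text from this PR:

if (!authorization) {
  throw new Error(
    "OpenRouter credentials missing: send an Authorization header or set the OPENROUTER_API_KEY environment variable.",
  );
}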
