
Conversation

adyanthm

## What I Did

Referred to the leaked system prompts of tools like Cursor, Windsurf, Trae, and other agentic IDEs, and crafted a specialized system prompt. The new prompt is designed to make the model strictly adhere to both agentic-mode instructions and the user's intent, improving the overall developer experience.

---

## What It Changes

The default system prompt isn't strict or directive enough for smaller, local, or non-agent-aware models. Such models often ask multiple clarifying questions (5–6 on average) before beginning the task.

This new prompt was designed and tuned specifically for:
- OpenAI models (e.g., GPT-4o and the `o` series)
- Locally hosted models

With this change, the model can initiate **full project generation** from a single, high-level prompt instead of requesting further clarification.

---

## Key Improvements

- Enables models to **actually modify files** instead of only generating code blocks in responses.
- Makes the system prompt more actionable, reducing friction for developers and enabling real-time coding experiences.

---

## My Opinion

Compared to the default system prompt, this enhanced version is a major step forward, especially for locally hosted or prompt-sensitive models. While it still isn't on par with Cursor's or Windsurf's prompt-engineering depth, it's a substantial improvement.

👉 Please review and consider integrating this updated system prompt in the next release.
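As a rough illustration of the file-modification behavior described above (all names here are hypothetical, not Void's actual API): a stricter system prompt can instruct the model to emit structured tool calls, such as XML-style `<edit_file>` blocks, which the host application can then parse and apply instead of leaving code in the chat response:

```typescript
// Hypothetical sketch: extract one <edit_file path="...">...</edit_file>
// tool call from raw model output. Illustrative only; not Void's real API.
export interface EditFileCall {
	path: string;
	content: string;
}

export function parseEditFileCall(modelOutput: string): EditFileCall | null {
	// [\s\S]*? spans newlines so multi-line file contents are captured.
	const m = modelOutput.match(/<edit_file path="([^"]+)">([\s\S]*?)<\/edit_file>/);
	if (!m) return null;
	return { path: m[1], content: m[2] };
}
```

The host would then write `content` to `path` on disk, which is the difference between "generating code blocks" and "actually modifying files."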

adyanthm added 2 commits July 18, 2025 16:33
@tkhwang

tkhwang commented Jul 23, 2025

@adyanthm Can I ask one question about this change?

As I read the PR, a lot of prompts are updated for agent mode, ultimately in `agentSystemMessage()`.

I could find that `chat_systemMessage()` is used for chat mode, but I couldn't find where `agentSystemMessage()` is called. Where is `agentSystemMessage()` used?

```ts
	// system message
	private _generateChatMessagesSystemMessage = async (chatMode: ChatMode, specialToolFormat: 'openai-style' | 'anthropic-style' | 'gemini-style' | undefined) => {
...
		const systemMessage = chat_systemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, chatMode, mcpTools, includeXMLToolDefinitions })
		return systemMessage
	}
```

Should `agentSystemMessage()` be called in the line above, by checking whether the mode is agent?

```ts
		const systemMessage = chatMode === 'agent'
			? agentSystemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, mcpTools, includeXMLToolDefinitions })
			: chat_systemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, chatMode, mcpTools, includeXMLToolDefinitions })
```

If I am wrong, please let me know.

@adyanthm
Author

Oh yes, sorry! I should have mentioned that my prompt is specifically modded for agent mode. It's from my modded fork, which only has agent mode, so I just set the system prompt without checking for agent mode. Can you fix it?

@tkhwang

tkhwang commented Jul 23, 2025

It seems I couldn't add a commit to your PR, so I'm just sharing the change here.
As mentioned, the change below uses the updated prompt in agent mode and still uses the legacy prompt for the other modes (chat and gather).

```ts
// before
		const systemMessage = chat_systemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, chatMode, mcpTools, includeXMLToolDefinitions })

// after
import { ..., agentSystemMessage } from "../common/prompt/prompts.js";
...
		const systemMessage = chatMode === 'agent'
			? agentSystemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, mcpTools, includeXMLToolDefinitions })
			: chat_systemMessage({ workspaceFolders, openedURIs, directoryStr, activeURI, persistentTerminalIDs, chatMode, mcpTools, includeXMLToolDefinitions })
```

You can check `systemMessage` with `console.log(systemMessage)` in chat, gather, and agent mode.

@adyanthm
Author

I am busy with another project at the moment. I gave you collaborator access to my forked repo. Please commit the changes there if you can.

- import `agentSystemMessage` from prompts.js
- conditionally set systemMessage based on 'agent' chat mode
@tkhwang

tkhwang commented Jul 24, 2025

Applied it.

@adyanthm
Author

@andrewpareles Please look into this and merge it as soon as you can.
