
Variable Parsing when in LLM Prompts #12241

Open
Namec999 opened this issue Dec 30, 2024 · 2 comments
Labels
🐞 bug Something isn't working

Comments

@Namec999

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.14.2

Cloud or Self Hosted

Cloud, Self Hosted (Docker), Self Hosted (Source)

Steps to reproduce

When assigning a conversation variable {{#conversation.var1#}} to another variable,

using that variable in any LLM node prompt treats the inner variable as a literal string instead of parsing it into its value.

✔️ Expected Behavior

When a variable is used inside a string, stored in another variable, or passed within the workflow, it must be parsed as a variable and not as a string.

❌ Actual Behavior

Variables embedded in text, e.g. {{#conversation.testVar#}} or /Context, are treated as literal strings in LLM node prompts.


dosubot bot commented Dec 30, 2024

To ensure that variables used inside a string in LLM prompts are parsed as variables and not as strings when using Dify version 0.14.2, you should use the '{{}}' format for variable interpolation. For example, a variable like '/name' should be written as '{{name}}' to be properly recognized and processed as a variable reference [1].

Additionally, Dify supports the use of the Jinja-2 template language for more complex prompts, which allows for lightweight data transformation and logical processing. You can also call out the variable insertion menu by typing / or { to insert special variable blocks or upstream node variables into the prompt as context content [2].


@dosubot dosubot bot added the 🐞 bug Something isn't working label Dec 30, 2024
@Namec999
Author

I mean I have a variable named var_ONE: "here is the value {{conversation.key}}"

Then, in an LLM node, I use the variable var_ONE as a prompt.

This does not parse the variable {{conversation.key}} and fetch its value; instead, it is just written out as a literal string.
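
The behavior described here matches single-pass template substitution: the engine expands the outer reference but does not re-scan the substituted value for further references. A minimal Python sketch of that distinction, using the standard library's `string.Template` purely as an analogy (Dify's actual `{{#...#}}` engine is not shown; the names and `${key}` syntax below are illustrative only):

```python
from string import Template

# Hypothetical context mimicking the report: var_ONE's *value* itself
# contains a placeholder, analogous to "here is the value {{#conversation.key#}}".
context = {
    "var_ONE": "here is the value ${key}",
    "key": "42",
}

# Single-pass substitution (the reported behavior): the outer reference
# is expanded, but the placeholder inside its value stays a literal string.
once = Template("${var_ONE}").substitute(context)
print(once)  # → here is the value ${key}

# A second pass over the result (the expected behavior) would resolve
# the nested reference as well.
twice = Template(once).substitute(context)
print(twice)  # → here is the value 42
```

Whether Dify should re-render nested references like this is exactly the question this issue raises.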
