
Prompt Caching by Anthropic Claude #7382

Closed
4 of 5 tasks
amatiytsiv opened this issue Aug 18, 2024 · 4 comments · May be fixed by #12164

Comments

@amatiytsiv

Self Checks

  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

1. Is this request related to a challenge you're experiencing? Tell me about your story.

Anthropic announced support for prompt caching; it would be nice to have it in the model config options: https://docs.anthropic.com/en/docs/build-with-claude/prompt-caching

2. Additional context or comments

No response

3. Can you help us with this feature?

  • I am interested in contributing to this feature.
@amatiytsiv amatiytsiv changed the title Prompt Caching by nthropic Claude Prompt Caching by Anthropic Claude Aug 18, 2024
@amatiytsiv
Author

Are there any plans to implement this capability in the Dify core and tools? With other tools I can see significant cost savings from using prompt caching.

@dnyg

dnyg commented Sep 3, 2024

It is a bit of a complicated feature to implement, because the cache is not set in the model config options but per prompt.

So we need a way, through the UI, to specify that a certain prompt should be cached, similar to how you can enable Jinja templates with a slider.

I have made a working version that uses specific tags to trigger caching, but I don't think Dify would be interested in this solution, as the UX is poor.
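For context, here is a minimal sketch of why this is per-prompt rather than per-model-config, based on the Anthropic Messages API documented at the link above. Caching is requested on an individual content block via a `cache_control` field, so a model-level toggle alone cannot express which part of the prompt to cache. (This is illustrative payload construction only, not Dify code; model name and prompt text are placeholders.)

```python
# Build a Messages API request body in which only the system prompt
# block is marked for caching. The cache_control marker is attached to
# a specific content block, not to the model configuration.
system_prompt = "You are a helpful assistant."  # placeholder

payload = {
    "model": "claude-3-5-sonnet-20240620",  # placeholder model id
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": system_prompt,
            # This block is cached; later requests that reuse the exact
            # same prefix can hit the cache instead of reprocessing it.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    "messages": [{"role": "user", "content": "Hello"}],
}

# At the time of this issue, the feature was gated behind a beta header:
headers = {"anthropic-beta": "prompt-caching-2024-07-31"}
```

Any UI for this would ultimately need to decide, per prompt segment, whether to emit that `cache_control` marker, which is why a simple per-model checkbox falls short.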

@amatiytsiv
Author

amatiytsiv commented Sep 3, 2024

> It is a bit of a complicated feature to implement, because the cache is not set in the model config options but per prompt.
>
> So we need a way, through the UI, to specify that a certain prompt should be cached, similar to how you can enable Jinja templates with a slider.
>
> I have made a working version that uses specific tags to trigger caching, but I don't think Dify would be interested in this solution, as the UX is poor.

I believe a conversational checkbox next to the send button is the way to go for now.
But most importantly, it also has to be available as a configurable variable in workflows and agents; it could serve as a quick win, avoiding the need for, or even preparing the ground for, a good RAG setup.

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Oct 4, 2024
@crazywoola crazywoola removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Oct 16, 2024

dosubot bot commented Nov 16, 2024

Hi, @amatiytsiv. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.

Issue Summary

  • Request to add prompt caching support for Anthropic Claude in model configuration.
  • Complexity arises as caching is set per prompt, not in model config options.
  • Discussion on implementing a UI solution, similar to enabling jinja templates.
  • You suggested a conversational checkbox next to the send button for configuration.

Next Steps

  • Is this issue still relevant to the latest version of the Dify repository? If so, please comment to keep the discussion open.
  • Otherwise, this issue will be automatically closed in 15 days.

Thank you for your understanding and contribution!

@dosubot dosubot bot added the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Nov 16, 2024
@dosubot dosubot bot closed this as not planned Won't fix, can't repro, duplicate, stale Dec 1, 2024
@dosubot dosubot bot removed the stale Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed label Dec 1, 2024