
Prompt Flow incomplete documentation #57

Open
mennolaan opened this issue Aug 28, 2024 · 1 comment
@mennolaan
Hi!

I'm trying to wrap my head around the Prompt Flow logic, as I expected more logic in the openapi.json and policy.xml.

The animated GIF creates the impression that we make a request to APIM at a single API endpoint. Within API Management, this request gets sent to Prompt Flow hosted in a container, and then, upon response, is sent to the OpenAI endpoint with the compressed prompt.

However, the policy.xml is empty, and the openapi.json only describes a score result. Also, the description says: "The Prompt Flow OpenAI connection will be facilitated by APIM, enabling load balancing, token counting, and other features." So one would at least expect some of those aspects in the policy.xml.
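For illustration, this is a minimal sketch of the kind of fragment one might expect to find in that policy.xml, based on the description quoted above. The backend id `promptflow-backend` is an assumption for this example, not something defined in the lab:

```xml
<policies>
    <inbound>
        <base />
        <!-- Hypothetical: route the incoming request to the Prompt Flow
             container backend. "promptflow-backend" is an assumed backend
             id, not taken from the lab. -->
        <set-backend-service backend-id="promptflow-backend" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- One might also expect outbound policies here for the token
             counting and load-balancing features the description mentions. -->
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
```

This is only a sketch of my expectation, not the actual lab configuration.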

I'm mostly interested in how the Prompt Flow hosted in a container relates to API Management.

But I have a feeling you're creating a prompt flow that does the compression and also passes the info to OpenAI, thus doing the load balancing within Prompt Flow itself?

Maybe my expectation is not correct, but it feels like this lab isn't complete.

@simonkurtz-MSFT (Collaborator)

Hi @mennolaan, thank you for submitting this issue! I'm assigning this to @vieiraae to have a look, as he is the originator of that lab and has the most context.
