
Promptflow Tracing Error with Custom LLM Node #3778

Closed Answered by alebullensmithPS
alebullensmithPS asked this question in Q&A

openai==1.37.1
opentelemetry-api==1.26.0
opentelemetry-exporter-otlp-proto-common==1.26.0
opentelemetry-exporter-otlp-proto-http==1.26.0
opentelemetry-proto==1.26.0
opentelemetry-sdk==1.26.0
opentelemetry-semantic-conventions==0.47b0
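Since the pinned versions above appear to be the relevant detail here, a quick stdlib-only sketch for confirming that the packages actually installed in an environment match those pins (the `PINS` mapping below is an assumption copied from the list above; adjust it to your own environment):

```python
from importlib.metadata import version, PackageNotFoundError

# Assumption: these pins mirror the list quoted in the question.
PINS = {
    "openai": "1.37.1",
    "opentelemetry-api": "1.26.0",
    "opentelemetry-sdk": "1.26.0",
    "opentelemetry-exporter-otlp-proto-http": "1.26.0",
}


def check_pins(pins):
    """Return {package: (installed_version_or_None, matches_pin)}."""
    report = {}
    for pkg, pinned in pins.items():
        try:
            installed = version(pkg)
        except PackageNotFoundError:
            installed = None  # package not installed at all
        report[pkg] = (installed, installed == pinned)
    return report


if __name__ == "__main__":
    for pkg, (installed, ok) in check_pins(PINS).items():
        print(f"{pkg}: installed={installed} matches_pin={ok}")
```

Running this before filing a tracing issue makes it easy to rule out a simple version drift between `opentelemetry-*` packages, which all share the `1.26.0` release line here.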

Replies: 1 comment 2 replies

2 replies
@ckittel

@alebullensmithPS

Answer selected by alebullensmithPS