When using AsyncOpenAI.embeddings.create with the PostHog Python SDK, embedding requests are logged as openai_embedding events with properties like "AI input (LLM)" and "AI Trace ID (LLM)". However, these events do not appear in the LLM observability dashboard, which expects properties like llm_input, llm_output, and llm_type.
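A minimal reproduction sketch. It assumes the PostHog-instrumented AsyncOpenAI wrapper from posthog.ai.openai and the posthog_distinct_id / posthog_trace_id keyword arguments from the PostHog LLM observability docs; the model name, keys, and IDs are placeholders:

```python
import asyncio

from posthog import Posthog
from posthog.ai.openai import AsyncOpenAI  # PostHog-instrumented OpenAI client

posthog = Posthog(project_api_key="phc_...", host="https://us.i.posthog.com")
client = AsyncOpenAI(api_key="sk-...", posthog_client=posthog)

async def main():
    # This call is captured as an "openai_embedding" event with the
    # "AI input (LLM)" / "AI Trace ID (LLM)" properties, but it does not
    # show up in the LLM observability dashboard.
    response = await client.embeddings.create(
        model="text-embedding-3-small",
        input="hello world",
        posthog_distinct_id="user_123",   # assumed kwargs, as used for the
        posthog_trace_id="trace_456",     # chat completion wrappers
    )
    print(len(response.data[0].embedding))

asyncio.run(main())
```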
Expected behavior:
Embedding events should use the same property names as other LLM traces so they appear in LLM observability by default.
Current workaround:
Manually capturing an llm_trace event with the expected property names after each embedding request.
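A sketch of that workaround, using a plain (non-wrapped) OpenAI client and posthog.capture with the property names described above; the event name, helper, and distinct ID handling are illustrative, not part of the SDK:

```python
import asyncio
import uuid

from openai import AsyncOpenAI
from posthog import Posthog

posthog = Posthog(project_api_key="phc_...", host="https://us.i.posthog.com")
openai_client = AsyncOpenAI(api_key="sk-...")

async def embed_with_trace(text: str, distinct_id: str) -> list[float]:
    trace_id = str(uuid.uuid4())
    response = await openai_client.embeddings.create(
        model="text-embedding-3-small",
        input=text,
    )
    embedding = response.data[0].embedding
    # Re-emit the request under the property names the LLM observability
    # dashboard reads, so the embedding shows up alongside other LLM traces.
    posthog.capture(
        distinct_id=distinct_id,
        event="llm_trace",
        properties={
            "llm_input": text,
            "llm_output": f"<{len(embedding)}-dim embedding>",
            "llm_type": "embedding",
            "llm_trace_id": trace_id,
        },
    )
    return embedding
```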