[BUG] Phoenix UI not showing VertexAI Token Count #1287
Comments
I'm also seeing that the kwargs are getting passed into the completion. Since both the completion and the callback are written to the log, it appears the miss is on the Arize UI side.
Hey @TheMellyBee! Thanks so much for reporting this! I believe this is on the instrumentation side, so I will be transferring it over to that repository. We will get this prioritized and looked at. Thanks for the thorough analysis! cc @nate-mar
The issue you're experiencing with the Phoenix UI not displaying the token count for VertexAI, despite it being visible in the logs, might be related to how the token counts are handled by the instrumentation. Here are a few potential areas to investigate:
By focusing on these areas, you might be able to identify the root cause of the discrepancy between the logs and the UI display.
Describe the bug
The UI isn't showing the token count for VertexAI after building up the callback manager when using LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider).
However, I can see the counts in the log on completion.
Code
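No code was attached to the report, so here is a minimal sketch of the setup described above. It is an assumption based on the OpenInference instrumentation packages, not the reporter's actual code:

```python
# Hypothetical reproduction sketch; assumes the openinference-instrumentation-llama-index
# and opentelemetry-sdk packages. The reporter's actual code was not attached.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
from openinference.instrumentation.llama_index import LlamaIndexInstrumentor

# Export spans to the console so the completion (and its token usage) is
# visible in the log, as described in the report.
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# Instrument LlamaIndex; VertexAI calls made through LlamaIndex are traced
# on these spans, but the token counts do not reach the Phoenix UI.
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)
```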
Expected behavior
The UI should show the token counts.
Environment:
Additional context
Using VertexAIInstrumentor().instrument(tracer_provider=tracer_provider) will get the tokens, but they come in as a different trace entirely (which makes sense). But if that instrumentor can capture them, it seems like this one should too.
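If the LlamaIndex spans never receive token-count attributes, the Phoenix UI has nothing to display even though the counts appear in the logs. A sketch of the mapping the instrumentation would need to perform (the llm.token_count.* attribute names follow the OpenInference semantic conventions; the usage-metadata keys and the helper function itself are assumptions for illustration):

```python
def usage_to_span_attributes(usage_metadata: dict) -> dict:
    """Hypothetical helper: translate a VertexAI-style usage block into the
    OpenInference span attributes that the Phoenix UI reads.

    The attribute names follow the OpenInference semantic conventions; the
    usage-metadata keys mirror VertexAI's response fields, but both the keys
    and this helper are assumptions for illustration, not library code.
    """
    return {
        "llm.token_count.prompt": usage_metadata.get("prompt_token_count", 0),
        "llm.token_count.completion": usage_metadata.get("candidates_token_count", 0),
        "llm.token_count.total": usage_metadata.get("total_token_count", 0),
    }

# Example: a usage block like the one visible in the completion logs.
attrs = usage_to_span_attributes(
    {"prompt_token_count": 12, "candidates_token_count": 34, "total_token_count": 46}
)
print(attrs["llm.token_count.total"])  # prints 46
```

If these attributes are missing from the spans (or set under different names), the counts would show up in the raw log output but not in the UI, matching the behavior described above.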