
Conversation

@Pulkit0729

This is a draft PR for #3240
PS: The test file was created by Copilot, but I have tested it thoroughly.

@CLAassistant

CLAassistant commented Nov 7, 2025

CLA assistant check
All committers have signed the CLA.

# but we print the results for visibility
print("Prewarm Test Results:")
print(f"  Without prewarm: {ttft_no_prewarm:.3f}s")
print(f"  With prewarm: {ttft_with_prewarm:.3f}s")
Contributor


Thanks for your contribution! I hope the livekit team reviews this and gets it in. Looks like a must. To facilitate review, I would recommend:

  • run ruff
  • move everything from chat_ctx = llm.ChatContext() to await llm_no_prewarm.aclose() into a helper method; it can likely be reused for the test_llm_prewarm of other providers (Gemini, etc.)
  • move the LLM you're choosing into a constant (in this case: gpt-4o-mini)
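The refactor suggested above could look roughly like the sketch below. It is a minimal, provider-agnostic illustration, not the PR's actual test: the helper `measure_ttft`, the `StubLLM` class, and the `first_token()` method are all hypothetical stand-ins (a real version would construct a livekit plugin LLM and stream a chat completion), but the shape — one shared timing helper plus a module-level model constant — matches the review comments.

```python
import asyncio
import time

# Reviewer's suggestion: keep the model under test in one constant so
# per-provider tests only vary in how they construct the LLM.
LLM_MODEL = "gpt-4o-mini"  # hypothetical constant name


async def measure_ttft(llm_factory) -> float:
    """Return time-to-first-token for an LLM built by `llm_factory`.

    `llm_factory` is a stand-in for provider-specific construction
    (openai, gemini, ...). We only assume the returned object has an
    async `first_token()` coroutine and an async `aclose()` — both
    hypothetical names for this sketch.
    """
    llm = llm_factory()
    start = time.perf_counter()
    await llm.first_token()  # wait for the first streamed token
    ttft = time.perf_counter() - start
    await llm.aclose()
    return ttft


class StubLLM:
    """Fake LLM so the sketch runs without network access."""

    def __init__(self, delay: float):
        self._delay = delay

    async def first_token(self):
        await asyncio.sleep(self._delay)  # simulate latency to first token

    async def aclose(self):
        pass


async def main():
    # The same helper serves both the cold and the prewarmed case.
    ttft_no_prewarm = await measure_ttft(lambda: StubLLM(0.02))
    ttft_with_prewarm = await measure_ttft(lambda: StubLLM(0.01))
    print("Prewarm Test Results:")
    print(f"  Without prewarm: {ttft_no_prewarm:.3f}s")
    print(f"  With prewarm: {ttft_with_prewarm:.3f}s")
    return ttft_no_prewarm, ttft_with_prewarm


if __name__ == "__main__":
    asyncio.run(main())
```

With this shape, a Gemini (or any other provider) variant of the test only needs to pass a different factory into `measure_ttft` rather than duplicating the timing logic.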

@Pulkit0729
Author

@marctorsoc Thanks for the review, updated the PR

