llama-index-server: bump new chat ui #575
Conversation
Walkthrough
This change increments the chat UI version constant used by llama-index-server from "0.1.2" to "0.1.5".
Actionable comments posted: 0
🧹 Nitpick comments (1)
llama-index-server/llama_index/server/chat_ui.py (1)
Line 8: **Update CHANGELOG or documentation**
Consider adding an entry to `CHANGELOG.md` or the project’s documentation to note the UI version bump to `0.1.5`.
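For instance, a single unreleased-changes line along the lines of “Bump bundled `@llamaindex/server` chat UI to 0.1.5” (wording illustrative, not part of this PR) would be enough to make the bump discoverable.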
📜 Review details
Configuration used: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
llama-index-server/llama_index/server/chat_ui.py (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (1)
- GitHub Check: Unit Tests (windows-latest, 3.9)
🔇 Additional comments (2)
llama-index-server/llama_index/server/chat_ui.py (2)
Line 8: **Correctly bumped `CHAT_UI_VERSION`**
The constant has been updated from `"0.1.2"` to `"0.1.5"` as intended.
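For context, the change under review amounts to a single-line constant update. A sketch of line 8 (the rest of chat_ui.py is omitted here):

```python
# llama-index-server/llama_index/server/chat_ui.py, line 8
CHAT_UI_VERSION = "0.1.5"  # bumped from "0.1.2"
```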
Line 8: ✅ Verification successful
**Verify the new UI package is published**
Ensure that the `@llamaindex/server` package at version `0.1.5` is available on npm and the download link resolves successfully. Run the check below and expect HTTP status code `200`.
🏁 Script executed:

```shell
#!/bin/bash
# Verify that the new UI tarball exists on the npm registry
curl -o /dev/null -s -w "%{http_code}\n" https://registry.npmjs.org/@llamaindex/server/-/server-0.1.5.tgz
```

Length of output: 111
**@llamaindex/server v0.1.5 is published**
Verified that the tarball at https://registry.npmjs.org/@llamaindex/server/-/server-0.1.5.tgz returns HTTP 200.
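For future bumps, the same availability check can also be scripted in Python before shipping. A minimal sketch, assuming the public npm registry metadata endpoint accepts the scoped package name directly in the URL path; the helper name and the `__main__` usage are illustrative, not part of this PR:

```python
import json
import urllib.request


def npm_version_published(package: str, version: str) -> bool:
    """Return True if the given version of the package is published on npm."""
    # Fetch the package metadata ("packument") from the public npm registry;
    # scoped names such as "@llamaindex/server" are used as-is in the path.
    url = f"https://registry.npmjs.org/{package}"
    with urllib.request.urlopen(url) as resp:
        metadata = json.load(resp)
    # The "versions" map lists every published release of the package.
    return version in metadata.get("versions", {})


if __name__ == "__main__":
    ok = npm_version_published("@llamaindex/server", "0.1.5")
    print("published" if ok else "missing")
```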
Summary by CodeRabbit
- Updated the bundled chat UI to version 0.1.5.