[Obs AI Assistant] Error shown in knowledge base at startup #205333

Open
viduni94 opened this issue Dec 31, 2024 · 6 comments
Labels
bug (Fixes for quality problems that affect the customer experience), Team:Obs AI Assistant (Observability AI Assistant)

Comments

@viduni94
Contributor

An error related to the knowledge base is shown when the AI Assistant starts up, together with an "Inspect issues" link. When "Inspect issues" is clicked, the error says that the model is not deployed.

  • We shouldn't show this error at startup, because the only error at that point is that the inference endpoint doesn't exist yet (the knowledge base hasn't been installed):

        resource_not_found_exception
            Root causes:
                resource_not_found_exception: Inference endpoint not found [obs_ai_assistant_kb_inference]

  • In the error that is shown, modelId is not populated (refer to the screenshot).

Steps to reproduce:

  1. Start ES and Kibana
  2. Open the AI Assistant flyout
  3. The error will be shown within Inspect issues

Expected behavior:

  • There shouldn't be an error at startup
  • Wherever this error is legitimately shown because the model is not deployed, the correct modelId should be populated (see the sketch after this list).
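
A minimal sketch of one possible approach, assuming hypothetical helper names (this is not the actual Kibana implementation): treat a missing inference endpoint as "knowledge base not installed yet" rather than an error to surface at startup.

```ts
// Hypothetical sketch, not the actual Kibana code. The names below are
// illustrative assumptions.

interface KnowledgeBaseStatus {
  installed: boolean;
  modelId?: string;
  errorMessage?: string;
}

// `getInferenceEndpoint` stands in for whatever call resolves the
// obs_ai_assistant_kb_inference endpoint; it rejects when the endpoint is missing.
async function getKnowledgeBaseStatus(
  getInferenceEndpoint: () => Promise<{ modelId: string }>
): Promise<KnowledgeBaseStatus> {
  try {
    const endpoint = await getInferenceEndpoint();
    return { installed: true, modelId: endpoint.modelId };
  } catch (e) {
    // A resource_not_found_exception just means setup hasn't run yet,
    // so it shouldn't be reported as an error in the UI.
    if (e instanceof Error && e.message.includes('resource_not_found_exception')) {
      return { installed: false };
    }
    // Anything else is a real failure and should be surfaced, ideally with the
    // correct modelId populated by the caller.
    return { installed: false, errorMessage: e instanceof Error ? e.message : String(e) };
  }
}
```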

Screenshots (if relevant):

(screenshot attached)
viduni94 added the bug and Team:Obs AI Assistant labels on Dec 31, 2024
@elasticmachine
Contributor

Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)

@neptunian
Contributor

neptunian commented Jan 10, 2025

#205970 will not address this.

I'm not sure this is unintentional. I believe it only happens when you have a preconfigured connector. I guess it's not really an error, just information that your model is not deployed. Not the best UX, but it doesn't seem harmful either. I might be wrong, but I think the main user journey is to set up a connector first, in which case we auto-install the knowledge base and users won't see this.

CC @teknogeek0

@viduni94
Contributor Author

> I think the main user journey is to set up a connector first, in which case we auto-install the knowledge base and users won't see this.

Sounds good, thanks for the info @neptunian

@viduni94
Contributor Author

Once the PR for #205970 is merged, I'll review this issue and see whether we would need any changes to improve the UX here.

@sorenlouv
Member

@viduni94 One more thing you might want to take a look at is how we handle timeouts. I've seen errors when the /setup endpoint takes more than 2 minutes to respond. I think this happens when there are no ML nodes or when we need to scale from 0. It might be hard to reproduce locally, though.

@viduni94
Contributor Author

> One more thing you might want to take a look at is how we handle timeouts. I've seen errors when the /setup endpoint takes more than 2 minutes to respond. I think this happens when there are no ML nodes or when we need to scale from 0. It might be hard to reproduce locally, though.

Sure @sorenlouv
I'll take a look at it.
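
For illustration, a minimal sketch of guarding the long-running setup call with a client-side timeout, assuming a generic setupUrl parameter and the 2-minute limit mentioned above (this is not the actual Kibana code):

```ts
// Hypothetical sketch: abort the setup request instead of letting it hang,
// e.g. when there are no ML nodes or the cluster has to scale from zero.
// The setupUrl parameter and the timeout value are illustrative assumptions.

const SETUP_TIMEOUT_MS = 2 * 60 * 1000;

async function callSetupWithTimeout(setupUrl: string): Promise<'ok' | 'timed_out'> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), SETUP_TIMEOUT_MS);
  try {
    const res = await fetch(setupUrl, { method: 'POST', signal: controller.signal });
    if (!res.ok) {
      throw new Error(`setup failed with status ${res.status}`);
    }
    return 'ok';
  } catch (e) {
    // An abort means the request exceeded the timeout; report it distinctly
    // so the caller can keep polling rather than showing a raw error.
    if (e instanceof DOMException && e.name === 'AbortError') {
      return 'timed_out';
    }
    throw e;
  } finally {
    clearTimeout(timer);
  }
}
```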
