Add ChatQnA docker-compose example on AIPC using MariaDB Vector #1902
base: main
Conversation
Example of how to deploy the ChatBot on AIPC using MariaDB Server as a vector store.
- use MariaDB Server as the backend database; the minimum required version is 11.7
- use the OPEA_DATAPREP_MARIADBVECTOR component for the dataprep microservice
- use the OPEA_RETRIEVER_MARIADBVECTOR component for the retriever microservice

How to test

Set the HF API token environment variable and:
```
cd ChatQnA/tests
bash test_compose_mariadb_on_aipc.sh
```

Signed-off-by: Razvan-Liviu Varzaru <[email protected]>
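For a manual run, deployment would look roughly like the sketch below. This is a minimal sketch only: it assumes set_env_mariadb.sh exports all required variables (including the HF API token) and that the images referenced by compose_mariadb.yaml are available.

```
# Sketch only; the paths are the files added in this PR.
cd ChatQnA/docker_compose/intel/cpu/aipc

# Export model names, endpoints, and the HF API token
# (exact variable names are defined in set_env_mariadb.sh).
source set_env_mariadb.sh

# Start the services defined in the compose file
# (MariaDB, dataprep, embedding, retriever, ollama, UI, nginx).
docker compose -f compose_mariadb.yaml up -d
```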
Dependency Review: ✅ No vulnerabilities or license issues found. Scanned files: none.
Pull Request Overview
This PR introduces a docker-compose example for deploying the ChatQnA service on AIPC using MariaDB Vector.
- Adds a new compose file (compose_mariadb.yaml) configuring multiple services including MariaDB, dataprep, embedding, retriever, UI, nginx, and ollama.
- Provides a README (README_mariadb.md) with detailed instructions and quick-start steps to deploy the service.
Reviewed Changes
Copilot reviewed 2 out of 4 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| ChatQnA/docker_compose/intel/cpu/aipc/compose_mariadb.yaml | New docker-compose configuration that orchestrates several dependent services including the MariaDB container. |
| ChatQnA/docker_compose/intel/cpu/aipc/README_mariadb.md | Documentation for setting up and deploying the ChatQnA service with MariaDB Vector using docker-compose. |
Files not reviewed (2)
- ChatQnA/docker_compose/intel/cpu/aipc/set_env_mariadb.sh: Language not supported
- ChatQnA/tests/test_compose_mariadb_on_aipc.sh: Language not supported
Comments suppressed due to low confidence (1)
ChatQnA/docker_compose/intel/cpu/aipc/README_mariadb.md:1
- [nitpick] Consider updating the README title to reflect that this example uses MariaDB Vector, aligning it with the PR title.
# Build Mega Service of ChatQnA on AIPC
[pre-commit.ci] auto fixes; for more information, see https://pre-commit.ci
@chensuyue,
No, we don't have AIPC for CI.
We only have a few tests on AIPC, so we don't have dedicated resources for them. If we decide to implement AIPC tests for all the examples, I will try to apply for an AIPC machine for testing.
I've added a Xeon example here: #1916. As for this PR, feel free to close it, or if you find it valuable, you can merge it once an AIPC runner is available in CI. Thanks!
Labeling this as
Once PR #1654 is merged, the related images on Docker Hub will be refreshed within a day or two. But in CI tests, we always use images built from the latest GenAIComps main branch.
Description
Add ChatQnA docker-compose example on AIPC using MariaDB Vector.
Depends on: opea-project/GenAIComps#1645
Issues
n/a
Type of change
Dependencies
n/a
Tests
Set the HF API token environment variable and:
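(These are the same commands as in the summary above; the token variable name is whichever set_env_mariadb.sh and the test script expect.)
```
cd ChatQnA/tests
bash test_compose_mariadb_on_aipc.sh
```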