Add ChatQnA docker-compose example on Intel Xeon using MariaDB Vector #1916


Open

RazvanLiviuVarzaru wants to merge 4 commits into main from feature/mariadb-vector-xeon

Conversation

RazvanLiviuVarzaru

Description

Add ChatQnA docker-compose example on Intel Xeon using MariaDB Vector

Issues

n/a

Type of change

  • New feature (non-breaking change which adds new functionality)

Dependencies

n/a

Tests

Set the HF API token environment variable and run:

```
cd ChatQnA/tests
bash test_compose_mariadb_on_xeon.sh
```
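The test script reads the Hugging Face token from the environment. A minimal sketch of that setup, assuming the variable name `HUGGINGFACEHUB_API_TOKEN` and a placeholder value (check `set_env_mariadb.sh` for the name the scripts actually use):

```shell
# Sketch only: the variable name HUGGINGFACEHUB_API_TOKEN and the placeholder
# value are assumptions, not the PR's verbatim settings.
export HUGGINGFACEHUB_API_TOKEN="hf_xxx_replace_with_your_token"

# Fail fast if the token is empty before launching the compose test.
: "${HUGGINGFACEHUB_API_TOKEN:?HF API token must be set}"
echo "HF token configured"
```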

@Copilot Copilot AI review requested due to automatic review settings May 7, 2025 13:03

github-actions bot commented May 7, 2025

Dependency Review

✅ No vulnerabilities or license issues found.

Scanned Files

None

Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR introduces a new docker-compose example to deploy the ChatQnA application on Intel Xeon systems using MariaDB Vector. Key changes include the addition of a new YAML configuration for deploying multiple microservices and an accompanying README detailing the deployment and testing instructions.

Reviewed Changes

Copilot reviewed 2 out of 4 changed files in this pull request and generated 2 comments.

Files reviewed:
  • ChatQnA/docker_compose/intel/cpu/xeon/compose_mariadb.yaml — new docker-compose configuration with service definitions, healthcheck commands, and environment variables for deploying ChatQnA with MariaDB Vector.
  • ChatQnA/docker_compose/intel/cpu/xeon/README_mariadb.md — new documentation covering the build process, configuration, and deployment instructions for the ChatQnA application on Intel Xeon.
Files not reviewed (2)
  • ChatQnA/docker_compose/intel/cpu/xeon/set_env_mariadb.sh: Language not supported
  • ChatQnA/tests/test_compose_mariadb_on_xeon.sh: Language not supported

Example of how to deploy the ChatBot on Intel Xeon using MariaDB Server as a vector store:
- use MariaDB Server as the backend database (minimum required version: 11.7)
- use the OPEA_DATAPREP_MARIADBVECTOR component for the dataprep microservice
- use the OPEA_RETRIEVER_MARIADBVECTOR component for the retriever microservice
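For orientation, a hedged sketch of what a MariaDB service entry in a compose file such as this one might contain; the image tag, service name, database/user names, and healthcheck are illustrative assumptions based on the official `mariadb` image's documented conventions, not the PR's actual `compose_mariadb.yaml`:

```yaml
# Illustrative sketch only — names and values are assumptions, not the PR's file.
services:
  mariadb-server:
    image: mariadb:11.7   # MariaDB Vector requires 11.7 or later
    environment:
      - MARIADB_DATABASE=vectordb
      - MARIADB_USER=chatqna
      - MARIADB_PASSWORD=${MARIADB_PASSWORD}
      - MARIADB_RANDOM_ROOT_PASSWORD=yes
    healthcheck:
      # healthcheck.sh ships with the official mariadb image
      test: ["CMD", "healthcheck.sh", "--connect", "--innodb_initialized"]
      interval: 10s
      retries: 60
```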

How to test
Set the HF API token environment variable and run:
```
cd ChatQnA/tests
bash test_compose_mariadb_on_xeon.sh
```

Signed-off-by: Razvan-Liviu Varzaru <[email protected]>
@RazvanLiviuVarzaru RazvanLiviuVarzaru force-pushed the feature/mariadb-vector-xeon branch from e092a1d to b0cf593 Compare May 7, 2025 13:18
```
@@ -0,0 +1,25 @@
#!/usr/bin/env bash

# Copyright (C) 2025 MariaDB Foundation
```
Collaborator


This should also be OK, but I'm noting it here for myself; to be confirmed with other stakeholders.

Collaborator

@lkk12014402 lkk12014402 left a comment


Please resolve the comments.

Collaborator

@letonghan letonghan left a comment


Thanks @RazvanLiviuVarzaru for your contributions! Please check the comments below.

Signed-off-by: Razvan-Liviu Varzaru <[email protected]>
@RazvanLiviuVarzaru
Author

Thank you for the comments!
@ashahba / @letonghan, review comments addressed in 1f84e0d.

Collaborator

@letonghan letonghan left a comment


LGTM!

6 participants