Updates Eland install command in the vector search example #2599

Merged · 4 commits · Nov 24, 2023
11 changes: 4 additions & 7 deletions docs/en/stack/ml/nlp/ml-nlp-shared.asciidoc
@@ -1,16 +1,13 @@
 tag::nlp-eland-clone-docker-build[]
-You can use the {eland-docs}[Eland client] to install the {nlp} model. Eland
-commands can be run in Docker. First, you need to clone the Eland repository
-then create a Docker image of Eland:
+You can use the {eland-docs}[Eland client] to install the {nlp} model or use the pre-built
+Docker image to run the Eland install model commands. Pull the latest image with:
 
 [source,shell]
 --------------------------------------------------
-git clone git@github.com:elastic/eland.git
-cd eland
-docker build -t elastic/eland .
+docker pull docker.elastic.co/eland/eland
 --------------------------------------------------
 
-After the script finishes, your Eland Docker client is ready to use.
+After the pull completes, your Eland Docker client is ready to use.
 end::nlp-eland-clone-docker-build[]
 
 tag::nlp-requirements[]
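As a hedged aside on the change above: once the image is pulled, the Eland import command is typically invoked through `docker run`. The following is only a sketch; the cluster URL, model ID, and task type are placeholder values and are not part of this PR.

[source,shell]
--------------------------------------------------
# Sketch only: the URL, model ID, and task type below are placeholders.
docker run -it --rm --network host \
    docker.elastic.co/eland/eland \
    eland_import_hub_model \
      --url http://localhost:9200/ \
      --hub-model-id sentence-transformers/msmarco-MiniLM-L-12-v3 \
      --task-type text_embedding \
      --start
--------------------------------------------------

The `--start` flag deploys the model once the upload finishes; omit it to upload without deploying.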
@@ -19,6 +19,11 @@ consists of real questions from the Microsoft Bing search engine and human
 generated answers for them. The example works with a sample of this data set,
 uses a model to produce text embeddings, and then runs vector search on it.
 
+You can find
+https://github.com/elastic/elasticsearch-labs/blob/main/notebooks/integrations/hugging-face/loading-model-from-hugging-face.ipynb[this example as a Jupyter notebook]
+using the Python client in the `elasticsearch-labs` repo.
+
+
 [discrete]
 [[ex-te-vs-requirements]]
 == Requirements
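A similar hedged aside on the notebook link added above: one way to open it locally, assuming `git` and a Jupyter installation are already available, is to clone the `elasticsearch-labs` repository and launch the notebook from the path given in the link.

[source,shell]
--------------------------------------------------
# Sketch only: assumes git and a local Jupyter installation.
git clone https://github.com/elastic/elasticsearch-labs.git
cd elasticsearch-labs
jupyter notebook notebooks/integrations/hugging-face/loading-model-from-hugging-face.ipynb
--------------------------------------------------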