Using text-embeddings-inference returns "413 Payload Too Large" #12263
Labels: 👻 feat:rag, good first issue
Dify version
0.14.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Use text-embeddings-inference as the embedding service. Its single-request payload size is limited to 2MB by default, so when a document uploaded to the knowledge base is larger than 2MB, this error is raised.
(See the notes in the text-embeddings-inference documentation.)
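One possible client-side workaround is to cap the serialized request size before calling the embedding endpoint. The sketch below is not Dify's actual code; the endpoint URL, the 2MB limit, and the batching helper are assumptions. It splits texts into batches so that each POST to a TEI `/embed` route stays below the payload limit instead of sending one oversized request.

```python
# Hypothetical sketch: keep each request to a TEI /embed endpoint
# below its payload limit (2 MB by default) by batching the inputs.
import json
import requests  # assumption: requests is available in the environment

TEI_URL = "http://localhost:8080/embed"  # assumed local TEI endpoint
PAYLOAD_LIMIT = 2_000_000                # assumed default limit, in bytes
SAFETY_MARGIN = 0.9                      # stay comfortably below the limit


def embed_in_batches(texts: list[str]) -> list[list[float]]:
    """Send texts to TEI in size-bounded batches instead of one huge request."""
    embeddings: list[list[float]] = []
    batch: list[str] = []
    for text in texts:
        candidate = batch + [text]
        body = json.dumps({"inputs": candidate})
        if batch and len(body.encode("utf-8")) > PAYLOAD_LIMIT * SAFETY_MARGIN:
            # Flush the current batch before it would exceed the payload limit.
            resp = requests.post(TEI_URL, json={"inputs": batch}, timeout=60)
            resp.raise_for_status()
            embeddings.extend(resp.json())
            batch = [text]
        else:
            batch = candidate
    if batch:
        resp = requests.post(TEI_URL, json={"inputs": batch}, timeout=60)
        resp.raise_for_status()
        embeddings.extend(resp.json())
    return embeddings
```

Note that a single chunk that is itself larger than the limit would still fail; in that case the chunk needs to be split further, or the server-side limit raised (TEI exposes a `--payload-limit` option for this, if available in your version).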
✔️ Expected Behavior
Large files should be handled normally, without the embedding request failing.
❌ Actual Behavior
A "413 Payload Too Large" error is returned when processing large files.