File cannot be added because it exceeds the model's allowed context length (file size: 32.8 KB) #8385
1. A 7B model deployed with vLLM can accept `train.py` (a single file of 500 lines), but the Ollama-deployed Qwen3-Coder-30B model cannot.
2. The Ollama-deployed Qwen3-Coder-30B model can accept many folders (no upper limit found in my tests so far), but cannot accept a single file exceeding 300 lines.
3. For the model deployed with Ollama, what is the context length limit for a single file? What is the context length limit for a folder?
4. Below is my configuration file; the Continue version is 1.3.15 and the Ollama version is 0.11.7:
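Not certain this is the cause without seeing the configuration file, but a common reason for this symptom is that Ollama serves models with a small default context window (often 2048–4096 tokens) unless `num_ctx` is raised, so larger files are rejected even though the model itself supports a much longer context. A sketch of the two settings usually involved; the model tag `qwen3-coder:30b`, the derived name `qwen3-coder-32k`, and the 32768 value are assumptions to adjust for your setup:

```
# Modelfile (sketch) — build an Ollama model variant with a larger context window
FROM qwen3-coder:30b
PARAMETER num_ctx 32768
```

Create it with `ollama create qwen3-coder-32k -f Modelfile`, then point Continue at that model and declare the same context length on the Continue side (field names follow Continue's YAML config format as I understand it; please check the Continue docs for your version):

```yaml
# Continue config.yaml (fragment, assumed field names)
models:
  - name: Qwen3 Coder 30B (32k)
    provider: ollama
    model: qwen3-coder-32k
    defaultCompletionOptions:
      contextLength: 32768
```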
The model is the Q4-quantized qwen3-30b model. I'm not sure whether it's a model issue or a configuration issue, but when a single file exceeds 500 lines it cannot be used with @ in chat or edit, so I'm seeking help from the experts here. Can anyone suggest a solution? Thanks for your help!