When trying to configure a LoRA adapter, the environment variables for enabling LoRA and related settings are exposed (also in the RunPod UI), but there is no option for adding the actual LoRA modules (paths to the LoRA adapters or Hugging Face links).
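For illustration, the gap could be closed the same way the existing settings are exposed, e.g. as one more environment variable carrying a JSON list. The `LORA_MODULES` name and format below are purely hypothetical, and the exact names of the existing variables may differ from what the worker actually uses:

```
# LoRA-related variables that are already exposed (names follow the
# worker's uppercased engine-arg convention; verify against the README)
ENABLE_LORA=true
MAX_LORAS=4
MAX_LORA_RANK=16

# Missing today: a way to register the adapters themselves (hypothetical)
LORA_MODULES='[{"name": "my-adapter", "path": "someorg/some-lora-adapter"}]'
```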
In src/engine.py, the class OpenAIvLLMEngine is where these lists could be added (lines 137 and 145). As far as I can tell from the vLLM GitHub repository, the list should look like this:
```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class LoRAModulePath:
    name: str
    path: str
    base_model_name: Optional[str] = None


lora_modules: Optional[List[LoRAModulePath]]
```
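A minimal sketch of how the worker could build that list from an environment variable follows. The `LORA_MODULES` variable, its JSON format, and the helper function are assumptions for illustration, not existing worker-vllm behavior, and the import path for `LoRAModulePath` varies across vLLM versions:

```python
import json
import os
from typing import List, Optional

# Import path differs between vLLM releases; adjust to the pinned version.
from vllm.entrypoints.openai.serving_engine import LoRAModulePath


def lora_modules_from_env(var: str = "LORA_MODULES") -> Optional[List[LoRAModulePath]]:
    """Parse a JSON list of adapter specs from a (hypothetical) env var.

    Expected format:
    [{"name": "my-adapter", "path": "someorg/some-lora-adapter"}]
    """
    raw = os.getenv(var)
    if not raw:
        return None
    return [
        LoRAModulePath(
            name=entry["name"],
            path=entry["path"],
            base_model_name=entry.get("base_model_name"),
        )
        for entry in json.loads(raw)
    ]
```

The resulting list could then be passed through to the serving layer at the points mentioned above (lines 137 and 145 of src/engine.py).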
Without these LoRA modules, all the other LoRA settings seem to be useless.