
Lora-module support needed for using adapters #119

@sven-knoblauch

Description


When trying to configure a LoRA adapter, the ENV vars for enabling LoRA and other settings are exposed (also in the RunPod UI), but there is no option for adding the actual LoRA modules (paths to the LoRA adapters or a Hugging Face link).

In src/engine.py, the class OpenAIvLLMEngine has the option for adding these lists (lines 137 and 145).

As far as I saw on the vLLM GitHub page, the list should look like this:
lora_modules: Optional[List[LoRAModulePath]]

class LoRAModulePath:
    name: str
    path: str
    base_model_name: Optional[str] = None
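
A minimal sketch of how the worker could build that list from an ENV var. The LORA_MODULES variable name and its JSON shape are assumptions for illustration, not an existing worker convention, and the LoRAModulePath import path may differ between vLLM versions:

import json
import os
from typing import List, Optional

# Assumed import path; in the vLLM versions I checked, LoRAModulePath
# lives in vllm.entrypoints.openai.cli_args.
from vllm.entrypoints.openai.cli_args import LoRAModulePath


def lora_modules_from_env(var_name: str = "LORA_MODULES") -> Optional[List[LoRAModulePath]]:
    # Expected JSON shape (assumed):
    #   [{"name": "my-adapter", "path": "user/my-lora-repo"}]
    raw = os.environ.get(var_name)
    if not raw:
        return None
    return [
        LoRAModulePath(
            name=entry["name"],
            path=entry["path"],
            base_model_name=entry.get("base_model_name"),
        )
        for entry in json.loads(raw)
    ]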

Without these LoRA modules, all the other LoRA settings seem to be useless.
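
For illustration, once a module is registered this way, vLLM's OpenAI-compatible server lets a request select the adapter by its name through the model field; the endpoint URL and adapter name below are placeholders:

import requests

# "my-adapter" must match the name of a registered LoRAModulePath.
resp = requests.post(
    "http://localhost:8000/v1/completions",
    json={"model": "my-adapter", "prompt": "Hello", "max_tokens": 16},
)
print(resp.json())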
