[BugFix] add int8 cache dtype when using attention quantization #143
Annotations
8 errors
mypy (3.10):
vllm_ascend/worker.py#L112
Name "cache_config" is not defined [name-defined]

mypy (3.10):
Process completed with exit code 1.
mypy (3.12), mypy (3.11), mypy (3.9):
The jobs were canceled because "_3_10" failed; the operations were canceled.
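The `[name-defined]` error above means `worker.py` reads the bare name `cache_config` where only an attribute (or a local not yet assigned) exists. A minimal sketch of the error pattern and its fix, assuming hypothetical `CacheConfig`/`Worker` names that are illustrative only and not the actual vllm-ascend API:

```python
from dataclasses import dataclass


@dataclass
class CacheConfig:
    # "auto" mirrors the common default; "int8" enables a quantized KV cache.
    cache_dtype: str = "auto"


class Worker:
    def __init__(self, cache_config: CacheConfig) -> None:
        self.cache_config = cache_config

    def kv_cache_dtype(self) -> str:
        # Reading the bare name `cache_config` here would trigger mypy's
        # [name-defined] error (and a NameError at runtime); the config
        # must be read through `self.cache_config`.
        if self.cache_config.cache_dtype == "int8":
            return "int8"
        return "float16"
```

Usage: `Worker(CacheConfig(cache_dtype="int8")).kv_cache_dtype()` returns `"int8"`; with the default config it falls back to `"float16"`.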