
[BugFix]add int8 cache dtype when using attention quantization #136

Triggered via pull request February 21, 2025 02:20
Status Failure
Total duration 14s

yapf.yml

on: pull_request
Matrix: yapf

Annotations

1 error
yapf (3.12)
Process completed with exit code 1.
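
The failing annotation means the `yapf` job found files that do not match the project's formatting configuration: with `--diff`, yapf exits with code 1 whenever any checked file would be reformatted. A minimal sketch of reproducing and fixing this locally, assuming yapf is installed via pip and the repository carries its own yapf style config (the exact version pinned by this project's CI is not shown on this page):

```shell
# Install yapf (unpinned here as an assumption; match the version
# the project's requirements or CI workflow actually pins).
pip install yapf

# --diff prints the would-be changes and exits with code 1 if any file
# needs reformatting -- the same condition that failed this check.
yapf --diff --recursive .

# Apply the formatting in place, then re-run the check; a clean tree
# exits with code 0.
yapf --in-place --recursive .
yapf --diff --recursive .
```

Running the `--diff` form before pushing catches the failure without modifying any files.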