[BugFix] add int8 cache dtype when using attention quantization #139

Triggered via pull request February 21, 2025 02:52
Status: Success
Total duration: 19s
Artifacts

yapf.yml

on: pull_request
Matrix: yapf