
[Quantization] Channel-wise Output Activation Quantization for Attention QKV Modules + KV-cache channel quantization #762

Triggered via pull request on March 10, 2025, 16:20 by @horheynm (synchronize, #1233)
Branch: attn_quant
Status: Failure
Total duration: 55m 31s

test-check-transformers.yaml

on: pull_request

Jobs:
detect-changes (7s)
transformers-tests (26m 21s)

Annotations

4 errors
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.