[Quantization] Channel-wise Output Activation Quantization for Attention QKV Modules + KV-cache channel quantization #762
Job: detect-changes — succeeded Mar 10, 2025 in 7s