
Support for KV cache quantization for MLA Attention vLLM fakequant #714

Open
kinjalpatel27 wants to merge 4 commits into main from kinjal/vllm_mla

Commits

Commits on Dec 18, 2025
Commits on Dec 20, 2025
Commits on Dec 22, 2025
Commits on Jan 1, 2026
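The PR title refers to fake quantization ("fakequant") of the KV cache, i.e. quantizing and immediately dequantizing cached values so the rest of the model still operates on floats while seeing quantization error. The sketch below is a generic, illustrative example of symmetric int8 fake quantization applied to a latent KV tensor; it is not taken from this PR or from vLLM's actual MLA implementation, and all shapes and names are assumptions.

```python
import numpy as np

def fakequant_int8(x, axis=-1):
    # Symmetric int8 fake quantization along `axis`:
    # compute a per-slice scale from the absolute max, round to the
    # int8 grid, clip, then dequantize back to float.
    amax = np.max(np.abs(x), axis=axis, keepdims=True)
    scale = np.where(amax > 0, amax / 127.0, 1.0)
    q = np.clip(np.round(x / scale), -127, 127)
    return q * scale

# Example: a small MLA-style latent KV cache of shape
# (num_tokens, latent_dim) -- purely illustrative dimensions.
rng = np.random.default_rng(0)
kv_latent = rng.standard_normal((4, 8)).astype(np.float32)
kv_deq = fakequant_int8(kv_latent)
max_err = float(np.max(np.abs(kv_latent - kv_deq)))
```

Because the values are dequantized in place, downstream attention code runs unchanged; only the rounding error introduced by the int8 grid is observed, which is what makes fakequant useful for evaluating KV cache quantization accuracy without writing quantized kernels.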