CVE-2026-34159 in llama.cpp Information

Summary (English)

llama.cpp is a C/C++ library for inference of several LLM models. Prior to version b8492, the RPC backend's deserialize_tensor() skips all bounds validation when a tensor's buffer field is 0. An unauthenticated attacker can read and write arbitrary process memory via crafted GRAPH_COMPUTE messages. Combined with pointer leaks from ALLOC_BUFFER/BUFFER_GET_BASE, this yields a full ASLR bypass and remote code execution. No authentication is required; the attacker needs only TCP access to the RPC server port. This issue has been patched in version b8492.


Responsible

GitHub_M

Reserved

March 25, 2026

Published

April 1, 2026

Status

Confirmed

Entry

VulDB provides additional information and datapoints for this CVE:

Sources
