
CVE-2025-52566

llama.cpp tokenizer signed vs. unsigned heap overflow



Description

llama.cpp is an inference engine for several LLM models, written in C/C++. Prior to version b5721, a signed vs. unsigned integer overflow in llama.cpp's tokenizer implementation (llama_vocab::tokenize, src/llama-vocab.cpp:3036) causes an incorrect size comparison when copying tokens. This allows heap-overflowing the llama.cpp inference engine with carefully crafted text input during the tokenization process. This issue has been patched in version b5721.
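
The root cause class named below (CWE-195) is a bounds check in which a signed capacity is compared against an unsigned count. The following is a minimal, hypothetical sketch of that bug class and of the usual fix; the function and variable names are invented and this is not the actual llama.cpp code.

    #include <cstdint>
    #include <cstdio>

    // Hypothetical illustration of CWE-195 (signed-to-unsigned conversion)
    // in a token-copy bounds check; names are invented, not llama.cpp code.
    //
    // Buggy check: if n_tokens_max is negative, the implicit conversion to
    // size_t turns it into a huge value, so "n_produced > n_tokens_max" is
    // false; the vulnerable code path would then copy n_produced tokens
    // into a buffer that cannot hold them (heap overflow).
    static bool would_copy_buggy(size_t n_produced, int32_t n_tokens_max) {
        return !(n_produced > (size_t) n_tokens_max);
    }

    // Fixed check: reject negative capacities before any unsigned comparison.
    static bool would_copy_fixed(size_t n_produced, int32_t n_tokens_max) {
        return n_tokens_max >= 0 && n_produced <= (size_t) n_tokens_max;
    }

    int main() {
        const size_t  n_produced   = 1024; // token count driven by attacker text
        const int32_t n_tokens_max = -1;   // negative capacity, e.g. after truncation

        std::printf("buggy check says copy is safe: %d\n",
                    would_copy_buggy(n_produced, n_tokens_max));  // prints 1
        std::printf("fixed check says copy is safe: %d\n",
                    would_copy_fixed(n_produced, n_tokens_max));  // prints 0
        return 0;
    }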

Reserved 2025-06-18 | Published 2025-06-24 | Updated 2025-06-24 | Assigner GitHub_M


HIGH: 8.6 | CVSS:3.1/AV:L/AC:L/PR:N/UI:R/S:C/C:H/I:H/A:H

Problem types

CWE-119: Improper Restriction of Operations within the Bounds of a Memory Buffer

CWE-195: Signed to Unsigned Conversion Error

Product status

< b5721: affected

References

github.com/...ma.cpp/security/advisories/GHSA-7rxv-5jhh-j6xx

github.com/...ommit/dd6e6d0b6a4bbe3ebfc931d1eb14db2f2b1d70af

cve.org (CVE-2025-52566)

nvd.nist.gov (CVE-2025-52566)
