CVE-2025-3262: Transformers: ReDoS in chat.py causes CPU exhaustion
GHSA-489j-g2vx-39wf · HIGH · PoC available · CISA SSVC: Track

HuggingFace Transformers 4.49.0 contains a ReDoS vulnerability in the chat CLI command that allows unauthenticated network attackers to exhaust CPU resources with a single crafted string. Upgrade to 4.51.0 immediately. Risk is bounded to deployments where the transformers chat interface accepts external untrusted input — assess your exposure before treating this as a fire drill.
Risk Assessment
Moderate operational risk despite CVSS 7.5. The very low EPSS score (0.00132) and narrow attack surface — specifically the chat CLI command parser — limit real-world exploitation probability significantly. The network-accessible, no-auth, no-user-interaction attack path is the most concerning factor; any deployment exposing the transformers chat interface to the internet is directly vulnerable. Impact is availability-only with no data exfiltration or code execution risk, but a successful attack achieves complete service disruption.
Affected Systems
| Package | Ecosystem | Vulnerable Range | Patched |
|---|---|---|---|
| transformers | pip | >= 4.49.0, < 4.51.0 | 4.51.0 |
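The vulnerable range above can be checked programmatically. A minimal sketch, assuming a plain `X.Y.Z` version string (for pre-release or exotic version formats, prefer `packaging.version`); the helper names are illustrative:

```python
# Sketch: flag installed transformers versions in the vulnerable
# range >= 4.49.0, < 4.51.0.
from importlib.metadata import PackageNotFoundError, version

def in_vulnerable_range(ver: str) -> bool:
    # Compare the first three numeric components as a tuple.
    parts = tuple(int(p) for p in ver.split(".")[:3])
    return (4, 49, 0) <= parts < (4, 51, 0)

def installed_is_vulnerable(pkg: str = "transformers") -> bool:
    try:
        return in_vulnerable_range(version(pkg))
    except PackageNotFoundError:
        return False  # not installed, nothing to patch
```

This is handy for fleet-wide audits where shelling out to `pip show` per environment is impractical.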
Recommended Action
1. Upgrade transformers to 4.51.0 or later — patch commits 0720e206 and 126abe34 on GitHub.
2. Audit all AI/ML environments with `pip show transformers` or `pip list | grep transformers` to identify affected versions.
3. If immediate upgrade is blocked, apply input length limits and sanitize user-controlled strings before they reach the transformers chat interface.
4. Deploy CPU utilization alerts on transformer-serving processes — anomalous CPU spikes correlated with incoming requests are the primary detection signal.
5. Apply rate limiting on any public-facing endpoint using the transformers chat command.
6. Add transformers to your SCA/dependency scanning pipeline to catch future vulnerable versions at build time.
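The input-length stop-gap from step 3 can be as simple as a hard cap in front of the parser. A hedged sketch — `MAX_SETTING_LEN` and `safe_parse_setting` are illustrative names, not part of transformers:

```python
import re

MAX_SETTING_LEN = 256  # illustrative cap; catastrophic backtracking needs long inputs

def safe_parse_setting(text: str, pattern: re.Pattern) -> "re.Match | None":
    # Bound worst-case regex time by refusing oversized input before it
    # ever reaches the (potentially backtracking) pattern.
    if len(text) > MAX_SETTING_LEN:
        raise ValueError("setting string exceeds length limit")
    return pattern.fullmatch(text)
```

The same cap can also live at the proxy layer (e.g. a request body size limit) if patching application code is not an option.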
CISA SSVC Assessment
Source: CISA Vulnrichment (SSVC v2.0). Decision based on the CISA Coordinator decision tree.
Frequently Asked Questions
What is CVE-2025-3262?
HuggingFace Transformers 4.49.0 contains a ReDoS vulnerability in the chat CLI command that allows unauthenticated network attackers to exhaust CPU resources with a single crafted string. Upgrade to 4.51.0 immediately. Risk is bounded to deployments where the transformers chat interface accepts external untrusted input — assess your exposure before treating this as a fire drill.
Is CVE-2025-3262 actively exploited?
There are no public reports of exploitation in the wild (CISA SSVC decision: Track), but proof-of-concept exploit code is publicly available for CVE-2025-3262, increasing the risk of exploitation.
How to fix CVE-2025-3262?
1. Upgrade transformers to 4.51.0 or later — patch commits 0720e206 and 126abe34 on GitHub. 2. Audit all AI/ML environments with `pip show transformers` or `pip list | grep transformers` to identify affected versions. 3. If immediate upgrade is blocked, apply input length limits and sanitize user-controlled strings before they reach the transformers chat interface. 4. Deploy CPU utilization alerts on transformer-serving processes — anomalous CPU spikes correlated with incoming requests are the primary detection signal. 5. Apply rate limiting on any public-facing endpoint using the transformers chat command. 6. Add transformers to your SCA/dependency scanning pipeline to catch future vulnerable versions at build time.
What systems are affected by CVE-2025-3262?
This vulnerability affects the following AI/ML architecture patterns: LLM chat interfaces, model serving, chatbot deployments, agent frameworks.
What is the CVSS score for CVE-2025-3262?
CVE-2025-3262 has a CVSS v3.1 base score of 7.5 (HIGH). The EPSS exploitation probability is 0.13% (score 0.00132).
Technical Details
NVD Description
A Regular Expression Denial of Service (ReDoS) vulnerability was discovered in the huggingface/transformers repository, specifically in version 4.49.0. The vulnerability is due to inefficient regular expression complexity in the `SETTING_RE` variable within the `transformers/commands/chat.py` file. The regex contains repetition groups and non-optimized quantifiers, leading to exponential backtracking when processing 'almost matching' payloads. This can degrade application performance and potentially result in a denial-of-service (DoS) when handling specially crafted input strings. The issue is fixed in version 4.51.0.
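The failure class is easy to reproduce with a deliberately bad pattern. The regex below is NOT the actual `SETTING_RE` from `transformers/commands/chat.py`; it is a minimal nested-quantifier pattern exhibiting the same exponential backtracking on "almost matching" input:

```python
import re
import time

# ^(a+)+$ is a classic ReDoS shape: the nested quantifiers give the engine
# exponentially many ways to partition a run of "a"s once the trailing "!"
# forces the overall match to fail.
bad = re.compile(r"^(a+)+$")

for n in (12, 16, 20):
    payload = "a" * n + "!"  # "almost matching" input
    t0 = time.perf_counter()
    assert bad.match(payload) is None  # fails, but only after ~2**n backtracks
    print(n, f"{time.perf_counter() - t0:.4f}s")
```

For a pattern like this, the repair is to eliminate the nested quantifier — `^a+$` matches the same language with linear-time behavior. Timings grow roughly 2x per extra character, which is why a single modest-length string suffices for denial of service.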
Exploitation Scenario
An adversary identifies a public-facing AI chatbot or assistant service powered by HuggingFace Transformers 4.49.0. They craft a string that almost matches the SETTING_RE regex pattern — using repetitive characters designed to trigger catastrophic backtracking in the regex engine. When submitted as a chat setting parameter, the Python regex engine enters exponential time complexity. The attacker dispatches a burst of such requests (trivially automated), pinning the service CPU at 100% within seconds and rendering it unavailable to legitimate users. No credentials, no prior AI/ML knowledge, and no access to source code are required — only awareness of the affected regex location, which is publicly documented in the Huntr advisory.
Weaknesses (CWE)
CVSS Vector
CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
References
- github.com/advisories/GHSA-489j-g2vx-39wf
- github.com/huggingface/transformers/commit/126abe3461762e5fc180e7e614391d1b4ab051ca
- nvd.nist.gov/vuln/detail/CVE-2025-3262
- github.com/huggingface/transformers/commit/0720e206c6ba28887e4d60ef60a6a089f6c1cc76 (Patch)
- huntr.com/bounties/ecf5ccc4-39e7-4fb3-b547-14a41d31a184 (Exploit, 3rd Party)
Related Vulnerabilities
- CVE-2024-3568 (9.6): HuggingFace Transformers: RCE via pickle deserialization (same package: transformers)
- CVE-2023-6730 (8.8): HuggingFace Transformers: RCE via unsafe deserialization (same package: transformers)
- CVE-2024-11392 (8.8): HuggingFace Transformers: RCE via config deserialization (same package: transformers)
- CVE-2024-11393 (8.8): Transformers: RCE via MaskFormer model deserialization (same package: transformers)
- CVE-2024-11394 (8.8): Transformers: RCE via Trax model deserialization (same package: transformers)