CVE-2023-2800: Transformers: temp file race condition allows local DoS
GHSA-282v-666c-3fvg | MEDIUM | PoC AVAILABLE | CISA: TRACK

Low-urgency issue for most environments: exploitation requires local access to the same machine running transformers, plus precise timing. Upgrade to transformers>=4.30.0 as routine maintenance. Priority is elevated only if you run multi-tenant shared GPU clusters or CI/CD pipelines where untrusted users share compute with model training jobs.
Risk Assessment
MEDIUM with low practical urgency. CVSS 4.7 reflects the local-only attack vector (AV:L) and high complexity (AC:H) required to win the race condition. EPSS 0.00021 signals near-zero observed exploitation in the wild. Impact is confined to availability (C:N/I:N/A:H), meaning an attacker disrupts training/inference jobs but cannot exfiltrate models or data. Real risk surfaces primarily in shared HPC or Kubernetes GPU environments where multiple users share the same node filesystem.
Affected Systems
| Package | Ecosystem | Vulnerable Range | Patched |
|---|---|---|---|
| transformers | pip | < 4.30.0 | 4.30.0 |
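To confirm exposure, the installed version can be compared against the patched release. A minimal sketch; the `is_vulnerable` helper is illustrative and not part of any official tooling, and it assumes a plain X.Y.Z release string:

```python
from importlib.metadata import PackageNotFoundError, version

def is_vulnerable(installed: str) -> bool:
    """True if a transformers version string predates the 4.30.0 fix."""
    # Naive numeric comparison; pre-release strings would need a real
    # parser such as packaging.version.Version.
    parts = tuple(int(p) for p in installed.split(".")[:3])
    return parts < (4, 30, 0)

try:
    print(is_vulnerable(version("transformers")))
except PackageNotFoundError:
    print("transformers is not installed")
```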
Recommended Action
1. PATCH: Upgrade transformers to >=4.30.0 immediately via `pip install --upgrade transformers`. Patch commit: 80ca92470938bbcc348e2d9cf4734c7c25cb1c43.
2. WORKAROUND (if upgrade blocked): Set TMPDIR to a directory with strict 0700 permissions owned by the service account, eliminating cross-user temp file access.
3. DETECTION: Monitor for unusual file creation patterns in /tmp during model load operations; alert on SIGABRT or IOError exceptions in transformers processes on shared nodes.
4. ARCHITECTURE: Run training jobs in isolated containers (one user per pod) to eliminate the local access requirement entirely.
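The TMPDIR workaround above can be applied from Python before transformers is imported. A hedged sketch under the assumption that the process controls its own environment; the directory prefix is illustrative:

```python
import os
import stat
import tempfile

# Create a private scratch directory; mkdtemp() creates it with mode 0700,
# so other local users cannot list it or pre-create files inside it.
private_tmp = tempfile.mkdtemp(prefix="hf-private-")

# Point TMPDIR at the private directory and force the tempfile module to
# re-read the environment, so temp files created later (including by
# libraries imported after this point) land inside it.
os.environ["TMPDIR"] = private_tmp
tempfile.tempdir = None

assert stat.S_IMODE(os.stat(private_tmp).st_mode) == 0o700
```

In a service deployment the same effect is usually achieved by exporting TMPDIR in the unit file or container spec rather than in code.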
CISA SSVC Assessment
Source: CISA Vulnrichment (SSVC v2.0). Decision: Track, based on the CISA Coordinator decision tree.
Frequently Asked Questions
What is CVE-2023-2800?
Low-urgency issue for most environments: exploitation requires local access to the same machine running transformers, plus precise timing. Upgrade to transformers>=4.30.0 as routine maintenance. Priority is elevated only if you run multi-tenant shared GPU clusters or CI/CD pipelines where untrusted users share compute with model training jobs.
Is CVE-2023-2800 actively exploited?
No exploitation in the wild has been observed (EPSS 0.02%), but proof-of-concept exploit code is publicly available for CVE-2023-2800, which increases the risk of exploitation.
How to fix CVE-2023-2800?
1. PATCH: Upgrade transformers to >=4.30.0 immediately via `pip install --upgrade transformers`. Patch commit: 80ca92470938bbcc348e2d9cf4734c7c25cb1c43.
2. WORKAROUND (if upgrade blocked): Set TMPDIR to a directory with strict 0700 permissions owned by the service account, eliminating cross-user temp file access.
3. DETECTION: Monitor for unusual file creation patterns in /tmp during model load operations; alert on SIGABRT or IOError exceptions in transformers processes on shared nodes.
4. ARCHITECTURE: Run training jobs in isolated containers (one user per pod) to eliminate the local access requirement entirely.
What systems are affected by CVE-2023-2800?
This vulnerability affects the following AI/ML architecture patterns: training pipelines, model serving, MLOps pipelines, CI/CD model evaluation.
What is the CVSS score for CVE-2023-2800?
CVE-2023-2800 has a CVSS v3.1 base score of 4.7 (MEDIUM). The EPSS exploitation probability is 0.02%.
Technical Details
NVD Description
Insecure Temporary File in GitHub repository huggingface/transformers prior to 4.30.0.
Exploitation Scenario
An adversary with a low-privilege shell account on a shared GPU cluster waits for a co-tenant's transformers-based training job to start. During model download or tokenizer caching, transformers creates a predictable temporary file path. The adversary pre-creates a symlink at that path pointing to a critical system file or simply races to modify/delete the temp file mid-operation (TOCTOU). This causes the training process to crash with an IOError or PermissionError, achieving denial-of-service against the victim's compute job — wasting GPU time and potentially corrupting incomplete checkpoint files if the crash occurs mid-save.
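The race described above boils down to a predictable path in a world-writable directory. A minimal sketch of the unsafe pattern versus the safe one; the code is illustrative of the vulnerability class (insecure temporary files), not the actual transformers internals:

```python
import os
import tempfile

# UNSAFE (illustrative): a guessable name in the shared temp directory.
# A co-tenant who predicts this path can pre-create a symlink there, or
# race to delete/modify the file between check and use (TOCTOU).
racy_path = os.path.join(tempfile.gettempdir(), "model_download.tmp")

# SAFE: mkstemp() picks an unpredictable name and opens it atomically
# with O_CREAT|O_EXCL and mode 0600, so another user can neither
# pre-plant the file nor open it afterwards.
fd, safe_path = tempfile.mkstemp(suffix=".bin")
try:
    with os.fdopen(fd, "wb") as fh:
        fh.write(b"downloaded model bytes")
finally:
    os.unlink(safe_path)
```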
Weaknesses (CWE)
CWE-377: Insecure Temporary File
CVSS Vector
CVSS:3.1/AV:L/AC:H/PR:L/UI:N/S:U/C:N/I:N/A:H
References
- github.com/advisories/GHSA-282v-666c-3fvg
- github.com/huggingface/transformers/pull/23372
- github.com/pypa/advisory-database/tree/main/vulns/transformers/PYSEC-2023-299.yaml
- nvd.nist.gov/vuln/detail/CVE-2023-2800
- github.com/huggingface/transformers/commit/80ca92470938bbcc348e2d9cf4734c7c25cb1c43 (Patch)
- huntr.dev/bounties/a3867b4e-6701-4418-8c20-3c6e7084a44a (Exploit, Patch, 3rd Party)
Related Vulnerabilities
- CVE-2024-3568 (9.6) HuggingFace Transformers: RCE via pickle deserialization (same package: transformers)
- CVE-2023-6730 (8.8) HuggingFace Transformers: RCE via unsafe deserialization (same package: transformers)
- CVE-2024-11392 (8.8) HuggingFace Transformers: RCE via config deserialization (same package: transformers)
- CVE-2024-11393 (8.8) Transformers: RCE via MaskFormer model deserialization (same package: transformers)
- CVE-2024-11394 (8.8) Transformers: RCE via Trax model deserialization (same package: transformers)