CVE-2023-2800: Transformers: temp file race condition allows local DoS

GHSA-282v-666c-3fvg MEDIUM PoC AVAILABLE CISA: TRACK*
Published May 18, 2023
CISO Take

Low-urgency issue for most environments: exploitation requires local access to the same machine running transformers, plus precise timing. Upgrade to transformers>=4.30.0 as routine maintenance. Priority is elevated only if you run multi-tenant shared GPU clusters or CI/CD pipelines where untrusted users share compute with model training jobs.

Risk Assessment

MEDIUM with low practical urgency. CVSS 4.7 reflects the local-only attack vector (AV:L) and high complexity (AC:H) required to win the race condition. EPSS 0.00021 signals near-zero observed exploitation in the wild. Impact is confined to availability (C:N/I:N/A:H), meaning an attacker disrupts training/inference jobs but cannot exfiltrate models or data. Real risk surfaces primarily in shared HPC or Kubernetes GPU environments where multiple users share the same node filesystem.

Affected Systems

Package Ecosystem Vulnerable Range Patched
transformers pip < 4.30.0 4.30.0

Severity & Risk

CVSS 3.1
4.7 / 10
EPSS
0.02%
chance of exploitation in 30 days
Higher than 6% of all CVEs
Exploitation Status
Exploit Available
Exploitation: MEDIUM
Sophistication
Moderate
Exploitation Confidence
medium
CISA SSVC: Public PoC
Public PoC indexed (trickest/cve)
Composite signal derived from CISA KEV, CISA SSVC, EPSS, trickest/cve, and Nuclei templates.

Attack Surface

Attack Vector (AV) Local
Attack Complexity (AC) High
Privileges Required (PR) Low
User Interaction (UI) None
Scope (S) Unchanged
Confidentiality (C) None
Integrity (I) None
Availability (A) High

Recommended Action

4 steps
  1. PATCH

    Upgrade transformers to >=4.30.0 immediately via pip install --upgrade transformers. Patch commit: 80ca92470938bbcc348e2d9cf4734c7c25cb1c43.

  2. WORKAROUND (if upgrade blocked)

    Set TMPDIR to a directory with strict 0700 permissions owned by the service account, eliminating cross-user temp file access.

  3. DETECTION

    Monitor for unusual file creation patterns in /tmp during model load operations; alert on SIGABRT or IOError exceptions in transformers processes on shared nodes.

  4. ARCHITECTURE

    Run training jobs in isolated containers (one user per pod) to eliminate the local access requirement entirely.
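The Step 2 workaround can be sketched in Python as follows. This is a minimal illustration, not code from transformers itself: the base path and the `use_private_tmpdir` helper are hypothetical choices, and the key point is simply that the private directory must be in place before any library code resolves the default temp location.

```python
import os
import stat
import tempfile

def use_private_tmpdir(base="/tmp"):
    """Create a per-user temp dir with 0700 perms and make it the default.

    Illustrative helper for the TMPDIR workaround; the base path is an
    assumption, not a transformers requirement.
    """
    private = os.path.join(base, f"tmp-{os.getuid()}")
    os.makedirs(private, mode=0o700, exist_ok=True)
    # makedirs honors the process umask, so enforce the mode explicitly.
    os.chmod(private, stat.S_IRWXU)   # 0700: owner-only access
    os.environ["TMPDIR"] = private    # seen by child processes
    tempfile.tempdir = private        # seen by this process, even if cached
    return private
```

With owner-only permissions on the directory, other local users cannot pre-create or tamper with temp files there, which removes the cross-user race window even on unpatched versions.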

CISA SSVC Assessment

Decision Track*
Exploitation poc
Automatable No
Technical Impact partial

Source: CISA Vulnrichment (SSVC v2.0). Decision based on the CISA Coordinator decision tree.

Classification

Compliance Impact

This CVE is relevant to:

EU AI Act
Art.17 - Quality management system — technical robustness and reliability
ISO 42001
A.6.1.4 - AI system lifecycle — dependency and supply chain security
NIST AI RMF
GOVERN-6.1 - Policies for AI risk identification and prioritization MANAGE-2.2 - Mechanisms are in place to sustain AI risk management

Frequently Asked Questions

What is CVE-2023-2800?

Low-urgency issue for most environments: exploitation requires local access to the same machine running transformers, plus precise timing. Upgrade to transformers>=4.30.0 as routine maintenance. Priority is elevated only if you run multi-tenant shared GPU clusters or CI/CD pipelines where untrusted users share compute with model training jobs.

Is CVE-2023-2800 actively exploited?

Proof-of-concept exploit code is publicly available for CVE-2023-2800, increasing the risk of exploitation.

How to fix CVE-2023-2800?

1. PATCH: Upgrade transformers to >=4.30.0 immediately via `pip install --upgrade transformers`. Patch commit: 80ca92470938bbcc348e2d9cf4734c7c25cb1c43.
2. WORKAROUND (if upgrade blocked): Set TMPDIR to a directory with strict 0700 permissions owned by the service account, eliminating cross-user temp file access.
3. DETECTION: Monitor for unusual file creation patterns in /tmp during model load operations; alert on SIGABRT or IOError exceptions in transformers processes on shared nodes.
4. ARCHITECTURE: Run training jobs in isolated containers (one user per pod) to eliminate the local access requirement entirely.

What systems are affected by CVE-2023-2800?

This vulnerability affects the following AI/ML architecture patterns: training pipelines, model serving, MLOps pipelines, CI/CD model evaluation.

What is the CVSS score for CVE-2023-2800?

CVE-2023-2800 has a CVSS v3.1 base score of 4.7 (MEDIUM). The EPSS exploitation probability is 0.02%.

Technical Details

NVD Description

Insecure Temporary File in GitHub repository huggingface/transformers prior to 4.30.0.

Exploitation Scenario

An adversary with a low-privilege shell account on a shared GPU cluster waits for a co-tenant's transformers-based training job to start. During model download or tokenizer caching, transformers creates a predictable temporary file path. The adversary pre-creates a symlink at that path pointing to a critical system file or simply races to modify/delete the temp file mid-operation (TOCTOU). This causes the training process to crash with an IOError or PermissionError, achieving denial-of-service against the victim's compute job — wasting GPU time and potentially corrupting incomplete checkpoint files if the crash occurs mid-save.
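The vulnerability class in this scenario (CWE-377, insecure temporary file) comes down to predictable paths plus a check/create window. The sketch below contrasts the vulnerable pattern with the safe stdlib API; the function names and the fixed filename are hypothetical illustrations, not transformers' actual code paths.

```python
import os
import tempfile

def insecure_cache_write(data: bytes) -> str:
    """VULNERABLE pattern: predictable name in a shared directory.

    A co-tenant who guesses the path can pre-create a symlink there, or
    delete/modify the file between creation and use (TOCTOU).
    """
    path = os.path.join(tempfile.gettempdir(), "model_download.tmp")
    with open(path, "wb") as f:  # open() follows a pre-planted symlink
        f.write(data)
    return path

def secure_cache_write(data: bytes) -> str:
    """Safe pattern: mkstemp creates the file atomically with O_EXCL
    and an unpredictable suffix, leaving no race window to win."""
    fd, path = tempfile.mkstemp(prefix="model_download-")
    with os.fdopen(fd, "wb") as f:
        f.write(data)
    return path
```

Because `mkstemp` opens the file with `O_CREAT | O_EXCL` and mode 0600, a pre-planted symlink at the target path makes the call fail rather than silently redirecting the write, which is the property the 4.30.0 fix relies on.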

CVSS Vector

CVSS:3.1/AV:L/AC:H/PR:L/UI:N/S:U/C:N/I:N/A:H

Timeline

Published
May 18, 2023
Last Modified
November 22, 2024
First Seen
May 18, 2023

Related Vulnerabilities