CVE-2026-33660

GHSA-58qr-rcgv-642v CRITICAL

TensorFlow: type confusion NPD in tensor conversion

Published March 25, 2026
CISO Take

Any TensorFlow deployment running versions before 2.5.0 (or the backported patch releases) is vulnerable to a type confusion crash. On shared training infrastructure or multi-tenant ML platforms, a low-privileged local user can weaponize this to crash pipelines or potentially escalate. Patch immediately to TF 2.5.0, 2.4.2, 2.3.3, 2.2.3, or 2.1.4, depending on your pin. If patching is not immediate, enforce strict input validation at pipeline ingestion boundaries to reject non-numeric tensor types before they reach TF ops.

Affected Systems

Package | Ecosystem | Vulnerable Range | Patched
n8n     | npm       | ≤ 2.14.0         | 2.14.1


Severity & Risk

CVSS 3.1: 10.0 / 10
EPSS: 0.1% chance of exploitation in 30 days
KEV Status: Not in KEV
Sophistication: Moderate

Recommended Action

  1. Patch: upgrade to TF 2.5.0, or the cherry-picked backports 2.4.2, 2.3.3, 2.2.3, or 2.1.4, depending on your pin.
  2. Validate tensor dtype at pipeline ingestion: reject non-numeric dtypes (strings, objects, booleans) before passing data to TF ops, using tf.debugging.assert_type() or equivalent guards.
  3. Isolate TF runtimes on shared infrastructure using containers with separate user namespaces; prevent cross-tenant process access.
  4. For TF Serving deployments, implement a preprocessing layer (e.g., input signature enforcement via SavedModel signatures) that restricts accepted dtypes to the declared numeric types.
  5. Detection: monitor for process crashes (SIGSEGV/SIGABRT) in TF Serving or training processes; repeated crashes from the same input source are an exploitation signal.
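The dtype guard in step 2 can be sketched as a check at the ingestion boundary, before data ever reaches TF ops. This is a minimal illustration using NumPy; the function name and allowlist are illustrative, not part of TensorFlow:

```python
import numpy as np

# Illustrative allowlist: only the numeric dtypes the pipeline declares.
ALLOWED_DTYPES = {np.dtype("float32"), np.dtype("float64"),
                  np.dtype("int32"), np.dtype("int64")}

def validate_ingest(batch):
    """Reject non-numeric arrays before they reach TF ops.

    Raises TypeError for strings, objects, booleans, etc.
    (Hypothetical helper; adapt the allowlist to your pipeline.)
    """
    arr = np.asarray(batch)
    if arr.dtype not in ALLOWED_DTYPES:
        raise TypeError(
            f"rejected ingest dtype {arr.dtype}; "
            f"allowed: {sorted(str(d) for d in ALLOWED_DTYPES)}")
    return arr
```

Inside a TF graph, tf.debugging.assert_type(t, tf.float32) provides the equivalent per-tensor check.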

Classification

Compliance Impact

This CVE is relevant to:

EU AI Act
Article 15 - Accuracy, robustness and cybersecurity
Article 9 - Risk management system
ISO 42001
8.4 - AI system risk management
9.1 - Monitoring, measurement, analysis and evaluation
NIST AI RMF
GOVERN 1.1 - Policies and procedures for AI risk management
MANAGE 2.2 - Mechanisms for responding to and recovering from AI risks

Technical Details

NVD Description

n8n is an open source workflow automation platform. Prior to versions 2.14.1, 2.13.3, and 1.123.26, an authenticated user with permission to create or modify workflows could use the Merge node's "Combine by SQL" mode to read local files on the n8n host and achieve remote code execution. The AlaSQL sandbox did not sufficiently restrict certain SQL statements, allowing an attacker to access sensitive files on the server or even compromise the instance. The issue has been fixed in n8n versions 2.14.1, 2.13.3, and 1.123.26. Users should upgrade to one of these versions or later to remediate the vulnerability. If upgrading is not immediately possible, administrators should consider the following temporary mitigations: Limit workflow creation and editing permissions to fully trusted users only, and/or disable the Merge node by adding `n8n-nodes-base.merge` to the `NODES_EXCLUDE` environment variable. These workarounds do not fully remediate the risk and should only be used as short-term mitigation measures.
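The NODES_EXCLUDE workaround from the advisory can be applied via environment configuration. A sketch follows; the JSON-array value format follows n8n's environment-variable conventions, so verify it against the docs for your n8n version:

```shell
# Temporary mitigation only: disable the Merge node instance-wide.
# n8n reads NODES_EXCLUDE as a JSON array of node type names.
export NODES_EXCLUDE='["n8n-nodes-base.merge"]'
# Then restart the n8n service so the exclusion takes effect.
```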

Exploitation Scenario

An adversary with low-privileged local access to a shared ML training cluster (e.g., a data scientist account on a JupyterHub or a compromised CI/CD pipeline runner) crafts a Python script that calls TF operations passing string or boolean tensors where float32/int32 are expected. The ndarray_tensor.cc conversion logic fails to reject the type mismatch and dereferences a null pointer, crashing the TF process. On a multi-tenant training server, repeated targeted crashes can disrupt competing users' training jobs (DoS). In a more sophisticated scenario, the type confusion primitive (CWE-843) could be chained with a heap grooming technique to achieve controlled memory writes, escalating to arbitrary code execution within the TF worker process and potentially breaking container isolation.
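The repeated-crash signal described above can be surfaced with a small log scan. A sketch, assuming a hypothetical log format in which each fatal worker exit carries a `signal=` field and a `tenant=` tag (adapt both regexes to your actual logging pipeline):

```python
import re
from collections import Counter

# Hypothetical log fields; adjust to your log format.
CRASH_RE = re.compile(r"signal=(?:SIGSEGV|SIGABRT)")
TENANT_RE = re.compile(r"tenant=(\S+)")

def crash_counts(log_lines, threshold=3):
    """Count fatal TF worker crashes per tenant; return sources at/over threshold.

    Repeated SIGSEGV/SIGABRT from one input source suggests targeted
    exploitation rather than random flakiness.
    """
    counts = Counter()
    for line in log_lines:
        if CRASH_RE.search(line):
            m = TENANT_RE.search(line)
            counts[m.group(1) if m else "unknown"] += 1
    return {tenant: n for tenant, n in counts.items() if n >= threshold}
```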

CVSS Vector

CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:C/C:H/I:H/A:H

Timeline

Published: March 25, 2026
Last Modified: March 26, 2026
First Seen: March 25, 2026