Description
Hi @jdmartinev, I'd like to report that a potentially risky method is used in this project, which may pose a deserialization threat. Please check the following code example:
• Clase04/imageclassifier_fastapi/app.py

```python
from fastai.learner import load_learner
from huggingface_hub import hf_hub_download

learner = load_learner(hf_hub_download("jdmartinev/intel_image_classification_fastai", "model.pkl"))
```
Issue Description
As shown above, in the Clase04/imageclassifier_fastapi/app.py file, the file "model.pkl" is downloaded and then loaded by the `load_learner` method from the fastai library.
However, this method is not secure. `load_learner` uses `torch.load`, which relies on Python's `pickle` module as its core loading mechanism, and the documentation for both explicitly warns that loading untrusted files may lead to arbitrary code execution.
The official documentation says: "load_learner uses Python's insecure pickle module, which can execute malicious arbitrary code when loading. Only load files you trust. If you only need to load model weights and optimizer state, use the safe Learner.load instead."
Related risk reports: fastai `load_learner` documentation; PyTorch `torch.load` documentation.
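To illustrate the risk, here is a minimal, self-contained sketch (the payload class and file name are hypothetical) of how a pickle file can execute code the moment it is loaded; `load_learner` ultimately performs the same unpickling step on "model.pkl":

```python
import os
import pickle

class MaliciousPayload:
    """Hypothetical attacker object: pickle calls __reduce__ at load time."""
    def __reduce__(self):
        # Any callable can be returned here; os.system is the classic example.
        return (os.system, ("echo 'arbitrary code executed on load'",))

# The attacker ships this file as a "model".
with open("model.pkl", "wb") as f:
    pickle.dump(MaliciousPayload(), f)

# The victim merely loads the file -- the shell command runs immediately,
# before any model object is even returned.
with open("model.pkl", "rb") as f:
    pickle.load(f)
```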
Suggested Repair Methods
- I recommend using the `load_model` function or the `Learner.load` method of the fastai library, the officially recommended and safer loading functions, for restoring model weights instead; see the sketch below.
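As an example, here is a minimal sketch of the weights-only pattern, assuming the project can rebuild the model architecture in trusted code and publish a plain state-dict checkpoint ("model_weights.pth" and the class count are hypothetical); fastai's `Learner.load` wraps the same idea for a constructed `Learner`:

```python
import torch
from torchvision.models import resnet34
from huggingface_hub import hf_hub_download

# Rebuild the architecture from trusted code instead of unpickling an object.
model = resnet34(num_classes=6)  # hypothetical: 6 Intel image classes

# Download only a state-dict checkpoint (weights, not an arbitrary object).
weights_path = hf_hub_download("jdmartinev/intel_image_classification_fastai",
                               "model_weights.pth")  # hypothetical filename

# weights_only=True (PyTorch >= 1.13) restricts unpickling to tensors and
# plain containers, refusing to execute arbitrary code.
state_dict = torch.load(weights_path, map_location="cpu", weights_only=True)
model.load_state_dict(state_dict)
model.eval()
```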
As this is a popular open-source machine learning project, any potential risk could be propagated and amplified. Could you please address the issue above?
Thanks for your help~
Best regards,
Rockstars