Ensemble Structural Knowledge Distillation and Quantization

This repository contains training scripts, modified from torchvision's reference detection scripts, that implement a novel Ensemble Structural Knowledge Distillation framework: a Faster-RCNN student with a ResNet-18 backbone is trained using two Faster-RCNN teachers, one with a ResNet-50 backbone and the other with a ResNet-101 backbone.
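The core idea can be sketched as a distillation term that pushes the student's intermediate features toward an ensemble of the two teachers. This is a minimal illustrative sketch, not the repository's exact loss: the function name, the simple averaging of the teachers, the MSE matching objective, and the `alpha` weight are all assumptions, and the teacher feature maps are assumed to be already projected to the student's channel width.

```python
import torch
import torch.nn.functional as F

def ensemble_distillation_loss(student_feats: torch.Tensor,
                               teacher50_feats: torch.Tensor,
                               teacher101_feats: torch.Tensor,
                               alpha: float = 0.5) -> torch.Tensor:
    """Hypothetical structural distillation term.

    Averages the two teachers' backbone feature maps and penalizes the
    student's deviation from that ensemble with mean-squared error.
    """
    # Simple ensemble: element-wise mean of the two teachers' features.
    ensemble = (teacher50_feats + teacher101_feats) / 2
    # Weighted MSE between student features and the ensemble target.
    return alpha * F.mse_loss(student_feats, ensemble)
```

In training, a term like this would be added to the standard Faster-RCNN detection losses, so the student learns both from ground-truth boxes and from the teachers' internal representations.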

Post-Training Quantization was then applied to compress the distilled model further, making it suitable for deployment on edge devices.
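As a minimal sketch of post-training quantization in PyTorch (not the repository's exact recipe), dynamic quantization converts the weights of selected layer types to int8 after training, with no retraining required. The toy model below is an assumption purely for illustration:

```python
import torch
import torch.nn as nn

# A stand-in for a trained model (the real repo quantizes a distilled
# Faster-RCNN; a small MLP is used here so the sketch is self-contained).
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()

# Post-training dynamic quantization: Linear weights stored as int8,
# activations quantized on the fly at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
```

The quantized model keeps the same forward interface while shrinking weight storage roughly 4x, which is the property that makes the distilled detector edge-deployable.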

This research project was undertaken for my university course CSE465: Neural Networks and Pattern Recognition.
