---
title: Knowledge Distillation
keywords: fastai
sidebar: home_sidebar
summary: "How to apply knowledge distillation with fasterai"
description: "How to apply knowledge distillation with fasterai"
nb_path: "nbs/04b_tutorial.knowledge_distillation.ipynb"
---
We'll illustrate how to use Knowledge Distillation to distill the knowledge of a ResNet34 (the teacher) into a ResNet18 (the student).
Let's grab some data:
```python
from fastai.vision.all import *
from fasterai.distill.all import *  # SoftTarget and KnowledgeDistillation; exact path may vary with the fasterai version

path = untar_data(URLs.PETS)
files = get_image_files(path/"images")
def label_func(f): return f[0].isupper()  # cat breeds have capitalized file names in the Pets dataset
dls = ImageDataLoaders.from_name_func(path, files, label_func, item_tfms=Resize(64))
```
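If you want to sanity-check the labels before training, fastai's `show_batch` displays a few images with their `True`/`False` (cat or not) labels:

```python
# Preview a small batch to confirm images and labels look right
dls.show_batch(max_n=6)
```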
The first step is to train the teacher model. We'll start from a pretrained model to make sure we get good results on our dataset.
```python
# Fine-tune the whole pretrained ResNet34 on our data
teacher = cnn_learner(dls, resnet34, metrics=accuracy)
teacher.unfreeze()
teacher.fit_one_cycle(5)
```
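It can be worth checking how well the teacher does before distilling from it; `Learner.validate` returns the validation loss followed by the metrics:

```python
# Validation loss and accuracy of the teacher (numbers will vary from run to run)
print(teacher.validate())
```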
We'll now train a ResNet18 from scratch, without any help from the teacher model, to serve as a baseline.
```python
student = Learner(dls, resnet18(num_classes=2), metrics=accuracy)
student.fit_one_cycle(5)
```
And now we train the same model, but with the help of the teacher.
```python
# Soften the logits with a temperature of 30 before comparing student and teacher
loss = partial(SoftTarget, T=30)
student = Learner(dls, resnet18(num_classes=2), metrics=accuracy)
# The callback uses the teacher's predictions as soft targets during training
kd = KnowledgeDistillation(teacher, loss)
student.fit_one_cycle(5, cbs=kd)
```
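To make the role of `T` concrete, here is a minimal sketch of the classic soft-target loss from Hinton et al., which is the idea behind `SoftTarget` (fasterai's actual implementation may differ in details): both sets of logits are softened by the temperature, compared with a KL divergence, and scaled by T² so the gradient magnitude stays comparable across temperatures.

```python
import torch.nn.functional as F

def soft_target_sketch(student_logits, teacher_logits, T=30):
    "Hinton-style soft-target loss: KL divergence between temperature-softened distributions, scaled by T^2"
    soft_student = F.log_softmax(student_logits / T, dim=-1)
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(soft_student, soft_teacher, reduction='batchmean') * T * T
```

A high temperature like 30 flattens the distributions, so the student is pushed to match the relative probabilities the teacher assigns to the wrong classes as well as to the right one.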
When helped by the teacher, the student model performs better!