---
title: Schedules
keywords: fastai
sidebar: home_sidebar
summary: "When should I prune my network?"
description: "When should I prune my network?"
nb_path: "nbs/0c_schedules.ipynb"
---
The easiest schedule is one-shot pruning, i.e. pruning the network once. This can be done by simply returning the desired sparsity value. The moment at which pruning happens is controlled by the `start_epoch` argument of the `SparsifyCallback`.
```python
plot_sched(np.concatenate([train, os]), label='One-Shot')
```
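Written out, a one-shot schedule can be as simple as a function that ignores the training position and always returns the final sparsity. This is a minimal sketch assuming the `(start, end, pos)` signature used by the other schedules on this page; fasterai's own helper may be written differently:

```python
def one_shot(start, end, pos):
    # Return the target sparsity regardless of training progress:
    # the network is pruned to `end` in a single step.
    return end
```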
Instead of pruning the network to the desired sparsity in one step, you can do it iteratively. In fasterai, you can change the number of pruning iterations.
```python
plot_sched(np.concatenate([train, iterative(0,50, prune)]), label='Iterative')
```
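An iterative schedule can be sketched as a step function that raises the sparsity in `n_steps` equal increments over training. The version below is an illustrative assumption, not fasterai's exact implementation:

```python
import math

def iterative(start, end, pos, n_steps=3):
    # Divide training into n_steps intervals and jump to the next
    # sparsity level at the start of each one.
    step = min(math.floor(pos * n_steps) + 1, n_steps)
    return start + (end - start) * step / n_steps
```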
To modify the default `n_steps`, you can use the `partial` function.
```python
iterative = partial(iterative, n_steps=5)
plot_sched(np.concatenate([train, iterative(0,50, prune)]), label='Iterative')
```
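Since a schedule is just a function, `partial` simply pins the keyword argument before the callback calls it, while keeping the `(start, end, pos)` signature intact. A self-contained illustration, using a toy stepwise schedule (an assumption, not fasterai's exact code):

```python
import math
from functools import partial

def iterative(start, end, pos, n_steps=3):
    # Toy stepwise schedule (an assumption, not fasterai's exact code).
    step = min(math.floor(pos * n_steps) + 1, n_steps)
    return start + (end - start) * step / n_steps

# Pin n_steps=5; the result still takes (start, end, pos).
iterative5 = partial(iterative, n_steps=5)
print(iterative5(0, 50, 0.0))  # 10.0 (first of 5 steps)
```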
Some researchers have come up with more sophisticated schedules, such as Automated Gradual Pruning (AGP) (Zhu & Gupta, 2017).
```python
plot_sched(np.concatenate([train, sched_agp(0,50, prune)]), label='AGP')
```
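AGP increases sparsity along a cubic curve: it prunes aggressively early on, then tapers off as training progresses. A sketch in the same `(start, end, pos)` convention (fasterai's `sched_agp` may be implemented differently):

```python
def sched_agp(start, end, pos):
    # Cubic AGP curve: starts at `start` sparsity, ends at `end`,
    # with most of the pruning happening early in training.
    return end + (start - end) * (1 - pos) ** 3
```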
fasterai also provides a one-cycle schedule:

```python
plot_sched(sched_onecycle(0,50, prune), label='One-Cycle')
```
You can also create even more interesting behaviours, such as the DSD (Dense-Sparse-Dense) method, where you first prune the model, then re-grow it back to its initial amount of parameters.
```python
import math

def dsd(start, end, pos):
    if pos < 0.5:
        return start + (1 + math.cos(math.pi*(1 - pos*2))) * (end - start) / 2
    else:
        return end + (1 - math.cos(math.pi*(1 - pos*2))) * (start - end) / 2
```
```python
plot_sched(np.concatenate([train, dsd(0, 50, prune)]), label='DSD')
```
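To sanity-check the DSD curve, we can evaluate it at the endpoints and the midpoint: sparsity should start at `start`, peak at `end` halfway through training, and return to `start` at the end (the function is redefined here so the snippet runs standalone):

```python
import math

def dsd(start, end, pos):
    if pos < 0.5:
        return start + (1 + math.cos(math.pi*(1 - pos*2))) * (end - start) / 2
    else:
        return end + (1 - math.cos(math.pi*(1 - pos*2))) * (start - end) / 2

for pos in (0.0, 0.5, 1.0):
    print(pos, round(dsd(0, 50, pos), 2))
```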