
AutoPrune: Automatic Network Pruning by Regularizing Auxiliary Parameters

Reducing model redundancy is an important task when deploying complex deep learning models to resource-limited or time-sensitive devices. Directly regularizing or modifying weight values makes the pruning procedure less robust and more sensitive to the choice of hyperparameters, and it also requires prior knowledge to tune different hyperparameters for different models. To build a better-generalized and easy-to-use pruning method, we propose AutoPrune, which prunes the network by optimizing a set of trainable auxiliary parameters instead of the original weights. Instability and noise during training of the auxiliary parameters do not directly affect the weight values, which makes the pruning process more robust to noise and less sensitive to hyperparameters. Moreover, we design gradient update rules for the auxiliary parameters to keep them consistent with the pruning task. Our method can automatically eliminate network redundancy with recoverability, relieving the complicated prior knowledge required to design thresholding functions and reducing the time spent on trial and error. We evaluate our method with LeNet and a VGG-like network on the MNIST and CIFAR-10 datasets, and with AlexNet, ResNet, and MobileNet on ImageNet to establish the scalability of our work. Results show that our model achieves state-of-the-art sparsity: for example, 7% and 23% of the original FLOPs (310x and 75x compression ratios) for LeNet5 and the VGG-like structure without accuracy drop, and 200M and 100M FLOPs for MobileNet V2 with accuracies of 73.32% and 66.83%, respectively.
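To make the auxiliary-parameter idea concrete, below is a minimal PyTorch sketch, not the paper's implementation: each weight gets its own trainable auxiliary parameter whose sign decides whether the weight is kept, and a plain straight-through estimator stands in for the paper's custom gradient update rules. All names here (BinaryGate, PrunableLinear, aux) and the sigmoid-based sparsity penalty are illustrative assumptions.

```python
# Minimal sketch of pruning via trainable auxiliary parameters.
# The gate function, its gradient rule, and all names are assumptions;
# AutoPrune's exact formulation may differ.
import torch
import torch.nn as nn

class BinaryGate(torch.autograd.Function):
    """Hard 0/1 gate on the auxiliary parameters, with a
    straight-through estimator so gradients still reach them."""
    @staticmethod
    def forward(ctx, aux):
        return (aux > 0).float()  # a weight is kept iff its aux > 0

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through: pass the gradient to aux unchanged.
        return grad_output

class PrunableLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # One auxiliary parameter per weight, optimized instead of
        # thresholding the weights themselves. A positive init keeps all
        # connections alive at the start; pruning stays recoverable
        # because an aux value can later move back above zero.
        self.aux = nn.Parameter(torch.full((out_features, in_features), 0.1))

    def forward(self, x):
        mask = BinaryGate.apply(self.aux)
        return nn.functional.linear(x, self.weight * mask, self.bias)

layer = PrunableLinear(784, 10)
out = layer(torch.randn(32, 784))
# Sparsity is driven by regularizing the auxiliary parameters rather than
# the weights, e.g. a soft count of open gates added to the task loss:
sparsity_loss = torch.sigmoid(layer.aux).sum()
```

Note how the separation matches the abstract's robustness claim: noisy gradients perturb only aux, while the underlying weight values stay intact, so a connection gated off early can be restored without having been zeroed out.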
