PROBABILISTIC CONNECTION IMPORTANCE INFERENCE AND LOSSLESS COMPRESSION OF DEEP NEURAL NETWORKS
Document Type
Conference Proceeding
Publication Date
1-1-2020
Abstract
Deep neural networks (DNNs) can be huge in size, requiring a considerable amount of energy and computational resources to operate, which limits their applications in numerous scenarios. It is thus of interest to compress DNNs while maintaining their performance levels. Here we propose a probabilistic importance inference approach for pruning DNNs. Specifically, we test the significance of the relevance of each connection in a DNN to the DNN's outputs using a nonparametric scoring test and keep only the significant connections. Experimental results show that the proposed approach achieves better lossless compression rates than existing techniques.
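The abstract's core idea, keeping only those connections whose relevance to the output tests as statistically significant, can be illustrated with a simplified sketch. This is not the paper's actual scoring test; it substitutes a generic permutation test on the absolute correlation between a connection's contribution and the output, and all names (`connection_significance`, the toy data) are illustrative assumptions.

```python
import numpy as np

def connection_significance(x, w, y, n_perm=200, alpha=0.05, seed=0):
    """Score each input connection j by the relevance of its contribution
    x[:, j] * w[j] to the output y, using a permutation test on the
    absolute Pearson correlation as a simple nonparametric stand-in for
    the paper's scoring test. Returns a boolean keep-mask over connections."""
    rng = np.random.default_rng(seed)
    n, d = x.shape
    keep = np.zeros(d, dtype=bool)
    for j in range(d):
        contrib = x[:, j] * w[j]
        if np.std(contrib) == 0:
            continue  # constant contribution carries no information
        stat = abs(np.corrcoef(contrib, y)[0, 1])
        # Null distribution: break the pairing between contribution and output.
        null = np.empty(n_perm)
        for b in range(n_perm):
            null[b] = abs(np.corrcoef(rng.permutation(contrib), y)[0, 1])
        p_value = (1 + np.sum(null >= stat)) / (1 + n_perm)
        keep[j] = p_value <= alpha  # prune connections that fail the test
    return keep

# Toy demo: the output depends only on the first two inputs, so only
# those connections should be kept as significant.
rng = np.random.default_rng(1)
x = rng.normal(size=(500, 5))
w = np.ones(5)
y = 2.0 * x[:, 0] - 1.5 * x[:, 1] + 0.1 * rng.normal(size=500)
mask = connection_significance(x, w, y)
```

The permutation test here makes the pruning decision a hypothesis test rather than a magnitude threshold, which is the spirit of the probabilistic importance inference the abstract describes; the actual test statistic in the paper differs.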
Identifier
85146850660 (Scopus)
Publication Title
8th International Conference on Learning Representations, ICLR 2020
Grant
OAC 1920147
Fund Ref
National Science Foundation
Recommended Citation
Xing, Xin; Sha, Long; Hong, Pengyu; Shang, Zuofeng; and Liu, Jun S., "PROBABILISTIC CONNECTION IMPORTANCE INFERENCE AND LOSSLESS COMPRESSION OF DEEP NEURAL NETWORKS" (2020). Faculty Publications. 5668.
https://digitalcommons.njit.edu/fac_pubs/5668
