Subjective Feedback-based Neural Network Pruning for Speech Enhancement

EasyChair Preprint 2056 · 5 pages · Date: November 30, 2019

Abstract

Speech enhancement based on neural networks achieves performance superior to that of conventional algorithms. However, such networks often contain redundant parameters, which incur unnecessary computation and power consumption. This work aimed to prune a large network by removing extra neurons and connections while maintaining speech enhancement performance. Iterative network pruning combined with network retraining was employed to compress the network based on the weight magnitudes of neurons and connections. The pruning method was evaluated on a deep denoising autoencoder neural network trained to enhance speech perception under nonstationary noise interference. Word correct rate served as the subjective intelligibility feedback for assessing how well listeners understood noisy speech enhanced by the sparse network. Results showed that iterative pruning combined with retraining could remove 50% of the parameters without significantly affecting speech enhancement performance, outperforming two baseline conditions: direct (one-shot) network pruning with network retraining, and iterative network pruning without retraining. Finally, an optimized network pruning method was proposed that applies iterative pruning and retraining in a greedy, repeated manner, yielding a maximum pruning ratio of 80%.

Keyphrases: Subjective feedback, network pruning, neural network, speech enhancement
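The core procedure the abstract describes, magnitude-based pruning applied iteratively with a retraining pass between pruning steps, can be sketched as below. This is a minimal NumPy illustration under assumed details, not the authors' implementation: `retrain_fn` is a hypothetical placeholder for fine-tuning the surviving weights, and the linear pruning schedule is an assumption.

```python
import numpy as np

def magnitude_prune(weights, prune_ratio):
    """Zero out the smallest-magnitude entries so that `prune_ratio`
    of all weights are removed; returns pruned weights and a keep-mask."""
    flat = np.abs(weights).ravel()
    k = int(prune_ratio * flat.size)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

def iterative_prune(weights, target_ratio, steps, retrain_fn):
    """Prune gradually toward `target_ratio` over `steps` rounds,
    retraining after each round (iterative pruning + retraining,
    as opposed to one-shot/direct pruning)."""
    w = weights.copy()
    for step in range(1, steps + 1):
        ratio = target_ratio * step / steps   # assumed linear schedule
        w, mask = magnitude_prune(w, ratio)
        w = retrain_fn(w, mask)               # fine-tune surviving weights only
    return w
```

In a real setup, `retrain_fn` would run a few epochs of gradient descent on the enhancement loss while keeping masked weights at zero; the greedy variant in the abstract would repeat the whole prune-and-retrain cycle until performance (e.g., word correct rate) drops.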