TY - CONF
T1 - An ELU Network with Total Variation for Image Denoising
AU - Wang, Tianyang
AU - Qin, Zhengrui
AU - Zhu, Michelle
N1 - Publisher Copyright:
© 2017, Springer International Publishing AG.
PY - 2017
Y1 - 2017
AB - In this paper, we propose a novel convolutional neural network (CNN) for image denoising that uses the exponential linear unit (ELU) as its activation function. We investigate its suitability by analyzing ELU's connection with the trainable nonlinear reaction diffusion (TNRD) model and with residual denoising. Batch normalization (BN) is indispensable for residual denoising and for convergence; however, directly stacking BN and ELU degrades the performance of the CNN. To mitigate this issue, we design an innovative combination of the activation and normalization layers to fully exploit the ELU network, and we discuss the corresponding rationale. Moreover, inspired by the fact that total variation (TV) minimization is effective for image denoising, we propose a TV-regularized L2 loss to evaluate the training effect over the iterations. Finally, we conduct extensive experiments showing that our model outperforms several recent and popular approaches on Gaussian denoising with specific or randomized noise levels, for both gray and color images.
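N1 - The two concrete components named in the abstract, the ELU activation and the TV-regularized L2 loss, can be sketched as follows. This is a minimal NumPy illustration assuming an anisotropic TV term and a hypothetical regularization weight lam; it is a sketch of the general technique, not the authors' implementation.

     import numpy as np

     def elu(x, alpha=1.0):
         # Exponential linear unit: identity for x > 0,
         # alpha * (exp(x) - 1) otherwise.
         return np.where(x > 0, x, alpha * np.expm1(x))

     def tv_l2_loss(pred, target, lam=1e-4):
         # TV-regularized L2 loss: ||pred - target||_2^2 + lam * TV(pred).
         # Anisotropic TV: absolute differences between vertically and
         # horizontally adjacent pixels. lam is a hypothetical weight,
         # not a value taken from the paper.
         l2 = np.sum((pred - target) ** 2)
         tv = (np.abs(np.diff(pred, axis=0)).sum()
               + np.abs(np.diff(pred, axis=1)).sum())
         return l2 + lam * tv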
KW - Convolutional neural network
KW - Deep learning
KW - ELU
KW - Image denoising
KW - Image processing
KW - Total variation
UR - http://www.scopus.com/inward/record.url?scp=85035215248&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-70090-8_24
DO - 10.1007/978-3-319-70090-8_24
M3 - Conference contribution
AN - SCOPUS:85035215248
SN - 9783319700892
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 227
EP - 237
BT - Neural Information Processing - 24th International Conference, ICONIP 2017, Proceedings
A2 - Liu, Derong
A2 - Xie, Shengli
A2 - El-Alfy, El-Sayed M.
A2 - Zhao, Dongbin
A2 - Li, Yuanqing
PB - Springer Verlag
T2 - 24th International Conference on Neural Information Processing, ICONIP 2017
Y2 - 14 November 2017 through 18 November 2017
ER -