An ELU Network with Total Variation for Image Denoising

Tianyang Wang, Zhengrui Qin, Michelle Zhu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

In this paper, we propose a novel convolutional neural network (CNN) for image denoising that uses the exponential linear unit (ELU) as its activation function. We investigate ELU's suitability by analyzing its connection with the trainable nonlinear reaction diffusion (TNRD) model and with residual denoising. Batch normalization (BN) is indispensable for residual denoising and for convergence; however, directly stacking BN and ELU degrades the performance of the CNN. To mitigate this issue, we design a new combination of activation and normalization layers to exploit the ELU network, and we discuss the corresponding rationale. Moreover, since minimizing total variation (TV) is itself a classical approach to image denoising, we propose a TV-regularized L2 loss to evaluate the training effect across iterations. Finally, extensive experiments show that our model outperforms several recent and popular approaches on Gaussian denoising with specific or randomized noise levels, for both grayscale and color images.
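To make the abstract's two main ingredients concrete, below is a minimal PyTorch sketch of (a) a residual denoising CNN built from convolution, batch normalization, and ELU layers, and (b) a TV-regularized L2 training loss. Everything here is illustrative: the depth, feature width, tv_weight, and layer ordering are assumptions made for the sketch, not the paper's actual architecture or hyperparameters. In particular, the naive Conv-BN-ELU stacking shown is exactly the arrangement the abstract reports as harmful; the paper's improved activation/normalization combination is not reproduced here.

import torch
import torch.nn as nn


def tv_loss(x: torch.Tensor) -> torch.Tensor:
    """Anisotropic total variation of a batch of images shaped (N, C, H, W):
    mean absolute difference between vertically and horizontally adjacent
    pixels. Minimizing TV favors piecewise-smooth, denoised-looking output."""
    dv = (x[:, :, 1:, :] - x[:, :, :-1, :]).abs().mean()
    dh = (x[:, :, :, 1:] - x[:, :, :, :-1]).abs().mean()
    return dv + dh


class TVRegularizedL2Loss(nn.Module):
    """L2 fidelity to the clean target plus a TV penalty on the output.
    tv_weight is a hypothetical hyperparameter chosen for this sketch."""

    def __init__(self, tv_weight: float = 1e-4):
        super().__init__()
        self.tv_weight = tv_weight
        self.mse = nn.MSELoss()

    def forward(self, denoised: torch.Tensor, clean: torch.Tensor) -> torch.Tensor:
        return self.mse(denoised, clean) + self.tv_weight * tv_loss(denoised)


class EluDenoiser(nn.Module):
    """Residual denoising CNN: the body predicts the noise map, which is
    subtracted from the noisy input (residual learning, as in DnCNN).
    ELU(x) = x for x > 0 and alpha * (exp(x) - 1) otherwise, so unlike ReLU
    it passes bounded negative activations. NOTE: the plain BN-then-ELU
    stacking below is what the paper identifies as problematic; its own
    layer combination differs."""

    def __init__(self, channels: int = 1, features: int = 64, depth: int = 8):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ELU()]
        for _ in range(depth - 2):
            layers += [
                nn.Conv2d(features, features, 3, padding=1, bias=False),
                nn.BatchNorm2d(features),
                nn.ELU(),
            ]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, noisy: torch.Tensor) -> torch.Tensor:
        return noisy - self.body(noisy)  # clean estimate = input - predicted noise


# Example: one training step on synthetic data with additive Gaussian noise.
model = EluDenoiser()
criterion = TVRegularizedL2Loss(tv_weight=1e-4)
clean = torch.rand(4, 1, 40, 40)
noisy = clean + 0.1 * torch.randn_like(clean)
loss = criterion(model(noisy), clean)
loss.backward()

The sketch shows only how a TV term adds a smoothness penalty on top of the usual L2 fidelity; the paper's actual training setup, noise levels, and benchmarks are described in the full text.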

Original language: English
Title of host publication: Neural Information Processing - 24th International Conference, ICONIP 2017, Proceedings
Editors: Derong Liu, Shengli Xie, El-Sayed M. El-Alfy, Dongbin Zhao, Yuanqing Li
Publisher: Springer Verlag
Pages: 227-237
Number of pages: 11
ISBN (Print): 9783319700892
DOI: 10.1007/978-3-319-70090-8_24
State: Published - 1 Jan 2017
Event: 24th International Conference on Neural Information Processing, ICONIP 2017 - Guangzhou, China
Duration: 14 Nov 2017 - 18 Nov 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10636 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 24th International Conference on Neural Information Processing, ICONIP 2017
Country: China
City: Guangzhou
Period: 14/11/17 - 18/11/17

Keywords

  • Convolutional neural network
  • Deep learning
  • ELU
  • Image denoising
  • Image processing
  • Total variation

Cite this

Wang, T., Qin, Z., & Zhu, M. (2017). An ELU Network with Total Variation for Image Denoising. In D. Liu, S. Xie, E-S. M. El-Alfy, D. Zhao, & Y. Li (Eds.), Neural Information Processing - 24th International Conference, ICONIP 2017, Proceedings (pp. 227-237). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10636 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-70090-8_24