Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/66929
Title: Steganalysis via deep residual network
Authors: Wu, ST
Zhong, SH
Liu, Y 
Keywords: Steganalysis
Convolutional neural network
Deep residual network
Residual learning
Steganography
Issue Date: 2016
Publisher: Institute of Electrical and Electronics Engineers
Source: 2016 IEEE 22nd International Conference on Parallel and Distributed Systems (ICPADS), Dec 13-16, 2016, Wuhan, People's Republic of China, p. 1233-1236
Abstract: Recent studies have demonstrated that a well-designed deep convolutional neural network (CNN) achieves competitive performance in detecting the presence of secret messages in digital images, compared with classical rich-model-based steganalysis. In this paper, we investigate a category of very deep CNN, the deep residual network (DRN), for steganalysis. The DRN is well suited to steganalysis for two reasons. First, the DRN model usually contains a large number of network layers, which has proven effective for capturing the complex statistics of digital images. Second, the DRN's residual learning (ResL) method actively strengthens the signal coming from the secret message, which is extremely beneficial for discriminating between cover images and stego images. Comprehensive experiments on a standard dataset show that the DRN model achieves very low detection error rates against state-of-the-art steganographic algorithms, and that it outperforms the classical rich model method as well as several recently proposed CNN-based methods.
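Note: The abstract's central idea is residual learning, in which a shortcut connection adds the block's input back to its learned mapping, so very deep networks can be trained and weak signals (such as stego noise) are not washed out by stacked layers. The following is a minimal illustrative sketch of a residual block in PyTorch; the channel counts, layer layout, and stem convolution are assumptions for demonstration, not the paper's exact architecture.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Basic residual block: output = F(x) + x, so the layers only learn the
    residual F(x). The identity shortcut carries the input forward unchanged,
    which helps preserve the faint signal introduced by embedding a message."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                          # shortcut: keep the input as-is
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity                  # residual learning: add input back
        return self.relu(out)

# Usage sketch: a grayscale image mapped to 16 feature maps, then one residual block.
if __name__ == "__main__":
    stem = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # hypothetical stem layer
    block = ResidualBlock(16)
    x = torch.randn(1, 1, 256, 256)
    y = block(stem(x))
    print(y.shape)  # torch.Size([1, 16, 256, 256])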
Description: 22nd IEEE International Conference on Parallel and Distributed Systems (ICPADS), Wuhan, People's Republic of China, Dec 13-16, 2016
URI: http://hdl.handle.net/10397/66929
ISBN: 978-1-5090-4457-3
ISSN: 1521-9097
DOI: 10.1109/ICPADS.2016.165
Appears in Collections: Conference Paper
