Quantization-Unaware Double JPEG Compression Detection

A novel forensic image analysis tool for discovering double JPEG compression clues

Multimedia Tools and Applications, 2016

This paper presents a novel technique to discover double JPEG compression traces. Existing detectors operate only in the scenario where the image under investigation is explicitly available in JPEG format; consequently, when the quantization information of the JPEG file is unknown, their performance degrades dramatically. Our method addresses both forensic scenarios, resulting in a fresh perceptual detection pipeline. We propose a dimensionality-reduction algorithm to visualize the behavior of a large database containing various single- and double-compressed images. Based on the intuitions gained from this visualization, three learning strategies are proposed: bottom-up, top-down, and combined top-down/bottom-up. Our tool discriminates single-compressed images from their double-compressed counterparts, estimates the first quantization in double compression, and localizes tampered regions in a forgery examination. Extensive experiments on three databases demonstrate that the results are robust across different quality levels.
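
The paper's dimensionality-reduction algorithm is its own contribution; as a stand-in illustration of the visualization step, the sketch below projects simple DCT-histogram features of a labeled image set into 2-D with off-the-shelf t-SNE so that single- and double-compressed images can be inspected visually. The loader `load_gray_images`, the feature choice, and all parameters are assumptions, not details from the paper.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.manifold import TSNE

def dct_hist_feature(gray, bins=64):
    """Histogram of 8x8 blockwise DCT AC coefficients as a feature vector."""
    h, w = gray.shape[0] // 8 * 8, gray.shape[1] // 8 * 8
    b = gray[:h, :w].astype(np.float64).reshape(h // 8, 8, w // 8, 8).swapaxes(1, 2)
    c = dct(dct(b, norm='ortho', axis=-1), norm='ortho', axis=-2)
    c[:, :, 0, 0] = 0.0                              # ignore the DC term
    hist, _ = np.histogram(c, bins=bins, range=(-64, 64), density=True)
    return hist

# images, labels = load_gray_images()               # hypothetical loader
# X = np.stack([dct_hist_feature(g) for g in images])
# xy = TSNE(n_components=2, perplexity=30).fit_transform(X)
# Scatter-plotting xy colored by label shows how single- and
# double-compressed images cluster.
```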

A robust JPEG compression detector for image forensics

Signal Processing: Image Communication, 2020

Identification of JPEG-compressed images saved in an uncompressed format (JPEG-U images) is an important issue in forensic analysis. State-of-the-art JPEG compression detection methods fail to identify such images when they are subjected to post-processing or anti-forensic operations. In this paper, we propose a novel JPEG compression detector that is robust to post-processing and anti-forensic operations. The detector is based on the difference between the discrete cosine transform (DCT) coefficient distributions in the AC subbands of uncompressed images and JPEG-U images. We show theoretically and empirically that the probability that subband DCT coefficients lie in the interval (−0.5, 0.5) is significantly different for a JPEG-U image and the corresponding uncompressed image. This difference is exploited to derive a detection statistic, which is compared with a threshold to detect JPEG-U images. The detector makes use of calibration, a technique borrowed from steganalysis, to obtain the detection statistic. The experimental results show that the proposed detector significantly outperforms the state-of-the-art detectors, especially in the presence of post-processing and anti-forensic operations.
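
A minimal sketch of the detection idea described in the abstract, under assumed details: the statistic below is simply the fraction of AC DCT coefficients in (−0.5, 0.5) on the aligned grid minus the same fraction on a calibrated (cropped) version, and the 4-pixel crop and function names are illustrative rather than the authors' exact formulation.

```python
import numpy as np
from scipy.fftpack import dct

def blockwise_dct(gray):
    """8x8 blockwise 2-D DCT-II (orthonormal) of a grayscale float array."""
    h, w = gray.shape[0] // 8 * 8, gray.shape[1] // 8 * 8
    blocks = gray[:h, :w].reshape(h // 8, 8, w // 8, 8).swapaxes(1, 2)
    return dct(dct(blocks, norm='ortho', axis=-1), norm='ortho', axis=-2)

def near_zero_fraction(gray):
    """Fraction of AC coefficients falling in (-0.5, 0.5)."""
    coeffs = blockwise_dct(gray.astype(np.float64) - 128.0)
    mask = np.ones(coeffs.shape, dtype=bool)
    mask[:, :, 0, 0] = False                         # exclude the DC term
    return np.mean(np.abs(coeffs[mask]) < 0.5)

def jpeg_u_statistic(gray, crop=4):
    """Aligned fraction minus calibrated fraction. For a never-compressed
    image the two are close; a JPEG-U image shows many more near-zero AC
    coefficients on the aligned grid, so the statistic is larger."""
    return near_zero_fraction(gray) - near_zero_fraction(gray[crop:, crop:])
```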

First quantization coefficient extraction from double compressed JPEG images

Lecture Notes in Computer Science, 2013

In the forensics domain it can be useful to recover an image's history, and in particular whether or not it has been doubly compressed. Clarifying this point allows an analyst to assess whether, in addition to the compression performed at the time of shooting, the picture was decompressed and then resaved. This is not by itself a clear indication of forgery, but it can justify further investigation. In this paper we propose a novel technique to retrieve the coefficients of the first compression in a double-compressed JPEG image when the second compression is lighter than the first one. The proposed approach exploits the effects of successive quantizations followed by dequantizations to recover the original compression parameters. Experimental results and comparisons with a state-of-the-art method confirm the effectiveness of the proposed approach.
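
A toy numerical illustration (not the paper's algorithm) of why traces of the first quantization step q1 survive a second, lighter quantization q2 in the aligned case: after quantizing by q1, dequantizing, and requantizing by q2, only certain histogram bins remain populated, and the resulting comb pattern depends on q1.

```python
import numpy as np

def double_quantize(c, q1, q2):
    """Quantize with step q1, dequantize, then quantize with step q2."""
    return np.round(np.round(c / q1) * q1 / q2)

rng = np.random.default_rng(0)
coeffs = rng.laplace(scale=20.0, size=200_000)       # model AC DCT coefficients

bins = double_quantize(coeffs, q1=7, q2=3).astype(int)
print(np.bincount(bins[(bins >= 0) & (bins < 20)]))
# Bins 1, 3, 4, 6, ... are (almost) empty: this comb pattern is the trace
# of q1 = 7 that remains visible after the lighter q2 = 3 compression.
```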

First Quantization Matrix Estimation From Double Compressed JPEG Images

IEEE Transactions on Information Forensics and Security, 2014

One of the most common problems in the image forensics field is the reconstruction of the history of an image or a video. The data related to the characteristics of the camera that carried out the shooting, together with the reconstruction of any (possible) further processing, provide useful hints about the originality of the visual document under analysis. For example, if an image has been subjected to more than one JPEG compression, we can state that it is not the exact bitstream generated by the camera at the time of shooting. It is then useful to estimate the quantization steps of the first compression, which, for JPEG images edited and then saved again in the same format, are no longer available in the embedded metadata. In this paper we present a novel algorithm to achieve this goal for double JPEG compressed images. The proposed approach copes with the case in which the second quantization step is lower than the first, exploiting the effects of successive quantizations followed by dequantizations. To improve the estimation, a filtering strategy and a function devoted to finding the first quantization step have been designed. Experimental results and comparisons with state-of-the-art methods confirm the effectiveness of the proposed approach.
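
A hedged sketch of first-quantization-step estimation for a single DCT frequency in the aligned case with q2 < q1. This is a plain histogram-matching search over candidate q1 values, standing in for (not reproducing) the paper's filtering strategy; the Laplacian coefficient model, the candidate range, and the L1 distance are all assumptions.

```python
import numpy as np

def double_quantize(c, q1, q2):
    return np.round(np.round(c / q1) * q1 / q2)

def estimate_q1(observed, q2, candidates=range(1, 22), n=100_000, scale=20.0):
    """Return the candidate q1 whose simulated double-quantized histogram
    best matches the observed one (values already in units of q2)."""
    rng = np.random.default_rng(0)
    model = rng.laplace(scale=scale, size=n)         # crude AC coefficient model
    obs = np.bincount(np.abs(observed).astype(int), minlength=64)[:64]
    obs = obs / obs.sum()
    best, best_err = None, np.inf
    for q1 in candidates:
        sim = np.bincount(np.abs(double_quantize(model, q1, q2)).astype(int),
                          minlength=64)[:64]
        err = np.abs(sim / sim.sum() - obs).sum()    # L1 histogram distance
        if err < best_err:
            best, best_err = q1, err
    return best

# Self-check on synthetic data (true q1 = 7, q2 = 3):
rng = np.random.default_rng(1)
observed = double_quantize(rng.laplace(scale=20.0, size=100_000), 7, 3)
print(estimate_q1(observed, q2=3))                   # expected to print 7
```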

Estimating Previous Quantization Factors on Multiple JPEG Compressed Images

2021

The JPEG compression algorithm has proven efficient at saving storage while preserving image quality, and has thus become extremely popular. On the other hand, the overall process leaves traces in the encoded signal that are typically exploited for forensic purposes: for instance, the compression parameters of the acquisition device (or editing software) can be inferred. To this aim, this paper presents a novel technique to estimate "previous" JPEG quantization factors of images compressed multiple times in the aligned case, by analyzing statistical traces hidden in the Discrete Cosine Transform (DCT) histograms. Experimental results on double-, triple- and quadruple-compressed images demonstrate the effectiveness of the proposed technique while unveiling further interesting insights.
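
A minimal extension of the same quantize/dequantize simulation to more than two compressions, purely to illustrate why earlier quality factors remain estimable: the pattern of populated and empty histogram bins after a chain of aligned quantizations depends on every step in the chain. Illustrative only, not the paper's estimator.

```python
import numpy as np

def requantize_chain(c, steps):
    """Apply successive aligned quantize/dequantize steps, returning the
    result in units of the last quantization step."""
    for q in steps:
        c = np.round(c / q) * q
    return c / steps[-1]

rng = np.random.default_rng(0)
c = rng.laplace(scale=25.0, size=200_000)
triple = requantize_chain(c, [8, 5, 3]).astype(int)  # q1=8, q2=5, q3=3
print(np.bincount(triple[(triple >= 0) & (triple < 25)]))
# Which bins survive depends on the whole chain (8, 5, 3), which is what
# lets the "previous" quantization factors be estimated from the histogram.
```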

Double JPEG Compression Detection Using Statistical Analysis

Nowadays, with the advancement of technology, tampering with digital images using computers and advanced software packages such as Photoshop has become a simple task. Many algorithms have been proposed to detect tampered images, and they continue to be developed. In this regard, verifying the accuracy of image content and detecting manipulations without any prior knowledge of that content is an important research field. Recently, many efforts have been made in the area of image forensics, especially passive algorithms for detecting tampered images. JPEG is one of the most common formats used for image compression; hence, JPEG images are subjected to attacks such as manipulation and cropping. Since both single- and double-compressed JPEG images contain blocking artifacts, these images can be detected by assessing those artifacts. In double-compressed images that have been manipulated, the JPEG blocking artifacts will not be aligned. This paper examines the challenges in blocking-artifact extraction and improves the detection of double-compressed JPEG images. Experimental results show that the proposed approach performs well.
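
A minimal sketch of the kind of blocking-artifact measurement this line of work relies on, under assumed details (8x8 grid, simple first-order column differences): a large boundary-versus-interior contrast indicates JPEG blocking, and rescanning all grid shifts can expose a second, misaligned grid in a manipulated region. The paper's actual feature extraction may differ.

```python
import numpy as np

def blockiness(gray, shift=0):
    """Mean |difference| across vertical 8-pixel block boundaries minus the
    mean |difference| mid-block, for a grid offset by `shift` columns."""
    d = np.abs(np.diff(gray.astype(np.float64), axis=1))
    cols = np.arange(d.shape[1]) % 8
    boundary = d[:, cols == (7 + shift) % 8].mean()
    interior = d[:, cols == (3 + shift) % 8].mean()
    return boundary - interior

# Scanning shift = 0..7 (plus the analogous row-wise scan) over local windows
# of a manipulated double-compressed image can reveal regions whose blocking
# grid does not line up with the rest of the image.
```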

Estimating JPEG2000 compression for image forensics using Benford's Law

SPIE Proceedings, 2010

With the tremendous growth and usage of digital images nowadays, the integrity and authenticity of digital content is becoming increasingly important, and a growing concern to many government and commercial sectors. Image forensics, based on a passive statistical analysis of the image data only, is an alternative approach to the active embedding of data associated with digital watermarking. Benford's Law was first introduced to analyse the probability distribution of the first digit (1-9) of natural data, and has since been applied to accounting forensics for detecting fraudulent income tax returns [9]. More recently, Benford's Law has been further applied to image processing and image forensics. For example, Fu et al. [5] proposed a generalised Benford's Law technique for estimating the quality factor (QF) of JPEG compressed images. In our previous work, we proposed a framework incorporating the generalised Benford's Law to accurately detect unknown JPEG compression rates of watermarked images in semi-fragile watermarking schemes. JPEG2000 (a relatively new image compression standard) offers higher compression rates and better image quality compared to JPEG compression. In this paper, we propose the novel use of Benford's Law for estimating JPEG2000 compression for image forensics applications. By analysing the DWT coefficients and JPEG2000 compression of 1338 test images, the initial results indicate that the first-digit probability of DWT coefficients follows Benford's Law. The unknown JPEG2000 compression rate of an image can also be derived, and verified with the help of a divergence factor, which measures the deviation between the observed probabilities and Benford's Law. Based on the 1338 test images, the mean divergence for DWT coefficients is approximately 0.0016, which is lower than that for DCT coefficients at 0.0034. However, the mean divergence for JPEG2000 images compressed at a rate of 0.1 is 0.0108, which is much higher than for uncompressed DWT coefficients. This result clearly indicates the presence of compression in the image. Moreover, we compare the first-digit probabilities and divergences among JPEG2000 compression rates of 0.1, 0.3, 0.5 and 0.9. The initial results show that the expected differences among them could be used for further analysis to estimate unknown JPEG2000 compression rates.
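
A hedged sketch of the first-digit analysis described above. The divergence below is a simple chi-square-style sum of squared deviations normalized by the Benford probabilities; the paper's exact divergence factor may be defined differently, and the coefficient source (e.g., DWT subbands) is left to the caller.

```python
import numpy as np

# Benford's Law: P(d) = log10(1 + 1/d) for leading digit d = 1..9.
BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))

def first_digit_probs(coeffs):
    """Empirical probabilities of the leading digit (1-9) of the nonzero
    coefficient magnitudes (e.g., DWT or block-DCT coefficients)."""
    mags = np.abs(np.asarray(coeffs, dtype=np.float64))
    mags = mags[mags != 0]
    digits = (mags / 10.0 ** np.floor(np.log10(mags))).astype(int)
    return np.bincount(digits, minlength=10)[1:10] / digits.size

def benford_divergence(coeffs):
    """Deviation of the observed first-digit distribution from Benford."""
    p = first_digit_probs(coeffs)
    return np.sum((p - BENFORD) ** 2 / BENFORD)
```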

Simplified Anti-Forensics of JPEG Compression

Journal of Computers, 2013

This paper proposes a simplified anti-forensics method for JPEG compression. For a spatial-domain image decompressed from a JPEG file, traces of compression can be tracked by many forensic methods. To conceal these clues, namely the comb-like DCT histogram and blocking artifacts, we use image enhancement and filtering. Compared with Stamm's method of introducing noise into the target image, the proposed method preserves better quality and works faster. The risks posed by quantization estimation and global histogram analysis can also be avoided.
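
A rough sketch of the enhancement-and-filtering idea, with assumed parameters (the abstract does not specify the filters used): mild median filtering blended with the original perturbs coefficient values enough to smear the comb-like DCT histogram and soften blocking edges at modest visual cost.

```python
import numpy as np
from scipy.ndimage import median_filter

def simple_antiforensic(gray):
    """Blend a 3x3 median-filtered copy with the original so the comb-like
    DCT histogram left by quantization is filled in."""
    g = gray.astype(np.float64)
    smoothed = median_filter(g, size=3)
    out = 0.5 * g + 0.5 * smoothed                   # assumed blend weight
    return np.clip(np.round(out), 0, 255).astype(np.uint8)
```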

Detecting double compressed JPEG images

3rd International Conference on Imaging for Crime Detection and Prevention (ICDP 2009), 2009

Verifying the integrity of digital images and detecting the traces of tampering without using any protective pre-extracted or pre-embedded information plays an important role in image forensics and crime detection. When altering a JPEG image, it is typically loaded into photo-editing software and, after the manipulations are carried out, re-saved. This operation typically introduces specific artifacts into the image. In this paper we focus on these artifacts and propose an automatic method capable of detecting them.

Detectable Tampering of JPEG Anti-Forensics

2012

Many forensic techniques have recently tried to detect the tampering and manipulation of JPEG-compressed images, which has become a critical problem in image authentication and origin tracking. Some techniques have indicated that a knowledgeable attacker can make it very hard to trace the image origin, while others have indicated that portions of a compressed image that have been compressed with different quality-factor quantization matrices are distinguishable if they are recompressed with a higher quality-factor quantization matrix (one with smaller quantization steps). In this paper, we pursue the idea of recompressing forensically suspect images with different compression parameters. We use different quantization matrix sizes, which indicate a DCT projection at different frequencies (horizontally, vertically, and diagonally) and make it easier to track any tampering or hacking footprints. We show that a JPEG-compressed image can make these footprints distinguishable if recompressed with a smal...
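
A hedged sketch of the recompression idea: re-save the suspect image at a range of quality factors and inspect the per-block error. Note this follows the well-known "JPEG ghost" formulation rather than the paper's variable-size quantization matrices, and the quality range and block size here are assumptions.

```python
import io
import numpy as np
from PIL import Image

def recompression_error_maps(img, qualities=range(50, 96, 5), block=8):
    """For each quality factor, recompress the image and return a per-block
    mean-squared-error map; a local dip at some quality (a 'ghost') suggests
    that region was previously compressed at that quality."""
    gray = img.convert('L')
    g = np.asarray(gray, dtype=np.float64)
    h, w = (g.shape[0] // block) * block, (g.shape[1] // block) * block
    maps = {}
    for q in qualities:
        buf = io.BytesIO()
        gray.save(buf, format='JPEG', quality=q)
        buf.seek(0)
        r = np.asarray(Image.open(buf), dtype=np.float64)
        err = (g[:h, :w] - r[:h, :w]) ** 2
        maps[q] = err.reshape(h // block, block,
                              w // block, block).mean(axis=(1, 3))
    return maps
```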