

1826 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, VOL. 27, NO. 8, AUGUST 2017

Letter

Detection of Hue Modification Using Photo Response Nonuniformity

Jong-Uk Hou, Student Member, IEEE, and Heung-Kyu Lee

Abstract— Hue modification is a common strategy used to distort the true meaning of a digital image. In order to detect this kind of image forgery, we proposed a robust forensics scheme for detecting hue modification. First, we pointed out that the photo response nonuniformity (PRNU) separated by a color filter array forms a pattern independent of the others, since the positions of the PRNU pixels do not overlap. Using PRNUs from each color channel of an image, we designed a forensic scheme for estimating hue modification. We also proposed an efficient estimation scheme and an algorithm for detecting partial manipulation. The results confirmed that the proposed method distinguishes hue modification and estimates the degree of change; moreover, it is resistant to the effects of common image processing.

Index Terms— Color filter array (CFA), digital image forensics, hue, photo response nonuniformity (PRNU), sensor pattern noise.

I. INTRODUCTION

With the advent of high-quality, low-cost, easily accessible image-editing tools, digital images can be easily modified not only by highly trained professionals but also by most average digital camera users. One of the common strategies that image pirates use with digital images is hue modification. Hue is the main property of a color; therefore, counterfeiters who attempt to tamper with a color attribute most commonly tamper with the hue. With an image-editing tool, a person can severely distort the actual meaning of an image by modifying its hue. In second-hand markets, such as eBay and Amazon.com, counterfeiters can take unfair profits by changing the color of their merchandise. In addition, with severe hue modification, media outlets may broadcast distorted versions of a particular incident by changing the hue of images shot at the scene. For example, the German-language daily tabloid Blick forged an image by changing the color of flooding water to red so that it appeared to be blood, and then distributed the falsified image to news channels.

To cope with image forgeries, a number of forensic schemes have been proposed recently. Most schemes are based on detecting local inconsistencies such as resampling artifacts [1], color filter array (CFA) interpolation artifacts [2], JPEG compression [3], or lighting conditions [4]. The pixel photo response nonuniformity (PRNU) is also widely used for detecting digital image forgeries [5]–[9]. There are also some methods for detecting identical regions caused by copy–move forgery [10]–[13]. However, only two of these methods [2], [14] are able to detect hue forgery, because hue modification does not change any other aspect of an image, including edges, shapes, gradations, and the PRNU.

The authors are with the School of Computing, Korea Advanced Institute of Science and Technology, Daejeon 34141, South Korea (e-mail: [email protected]; [email protected]).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Manuscript received November 9, 2015; revised January 15, 2016; accepted February 3, 2016. Date of publication March 9, 2016; date of current version August 2, 2017. This work was supported in part by the Ministry of Culture, Sports and Tourism and in part by the Korea Copyright Commission. The work of J.-U. Hou was supported by the Global Ph.D. Fellowship Program through the National Research Foundation of Korea within the Ministry of Education under Grant 2015H1A2A1030715. This paper was recommended by Associate Editor C. Shan. (Corresponding author: Heung-Kyu Lee.)

Digital Object Identifier 10.1109/TCSVT.2016.2539828

Choi et al. [2] first proposed an algorithm for estimating hue modification using the neighboring correlation [15] induced by a CFA in a digital camera. They proposed a simple measure of changes in CFA patterns based on counting the number of pixels whose neighbors satisfy the interpolation condition. However, this algorithm loses its accuracy after common image processing (e.g., resizing or JPEG compression), which completely destroys the demosaicing trace of the original image. Moreover, Choi's algorithm works only with image data processed using a CFA Bayer pattern [16] whose pattern configuration is already known.

This paper extends our previous work [14], which described a naïve forensics scheme for estimating the degree of hue modification based on the PRNU. The advances of this paper are as follows. First, we propose an efficient estimation scheme that skips unnecessary intervals based on our hue-modification modeling. Second, we propose a local forgery detector in which two threshold values are adopted to determine the forged regions. Finally, we model the distribution of the test results, analyze the error ratio, and propose an equation with two thresholds to reduce the false-positive ratio.

The rest of this paper is organized as follows. In Section II, we explain the proposed method. In Section III, we test our method with various image data sets, and in Section IV, we present our conclusion.

II. PROPOSED METHOD

In this section, we describe how the PRNU separated by the CFA forms a pattern independently of the others. Based on our previous idea [14], we propose an enhanced algorithm for estimating hue modification using the separated PRNU, as well as an algorithm for detecting partial manipulation in which two threshold values are adopted to reduce the false-positive ratio. We now describe this process in detail.

A. Reference Patterns for Each Color Channel

The raw output of the image sensor is separated into three color components by a CFA [17]. The PRNU, a unique fingerprint of a digital camera, is also separated by this process. Fig. 1 shows an example of PRNU separation by the Bayer pattern [16], the most widely used pattern in digital cameras. Each separated PRNU forms a pattern independently of the others, because the positions of the PRNU pixels do not overlap. For example, as shown in Fig. 1, the noise residual N00 is included in the red channel, but not in the blue or green channel. Therefore, the PRNU of each color channel represents a unique characteristic of that untampered color channel.

In order to extract the PRNU, Lukáš et al. [18] proposed a scheme to obtain an approximation of the PRNU using a wavelet-based denoising filter. Using this method, we can obtain the PRNU of each color channel Pc by averaging multiple untampered images Ic(k), where k = 1, ..., Np and c ∈ {r, g, b}, and where r, g, and b denote the red, green, and blue color channels, respectively. The obtained PRNU Pc for each color channel c is used as a color-reference pattern, which represents a unique characteristic of each untampered

1051-8215 © 2016 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.


HOU AND LEE: DETECTION OF HUE MODIFICATION USING PRNU 1827

Fig. 1. PRNU separation by CFA Bayer pattern.

color channel. We noted that our method does not require a priori knowledge of the CFA color configuration, similar to [19]. As with other PRNU-based methods [5]–[8], the reference patterns should be generated using the same camera that took the suspicious image.

B. Proposed Estimation Algorithm

The naïve estimation process described in [14] is based on an exhaustive search algorithm, which is very time consuming. Therefore, we propose an enhanced scheme for efficiently estimating hue modification. Using this search scheme, we can estimate hue modification without checking every possible case. We explain our proposed algorithm in detail in the following discussion.

1) Modeling a Hue-Modification Effect of the PRNU: Each RGB pixel value of an image can be represented as a vector in 3D Cartesian coordinates. In this representation, a hue-modification process can be defined as a rotation of an RGB pixel vector that preserves its magnitude. Given a unit vector u = (ur, ug, ub), where ur = ug = ub = 1/√3, the matrix R(θ) for a rotation by an angle θ about an axis in the direction of u is given by

R(θ) = | α+β  β−γ  β+γ |
       | β+γ  α+β  β−γ |   (1)
       | β−γ  β+γ  α+β |

where α = cos θ, β = (1/3)(1 − cos θ), and γ = (1/√3) sin θ [20]. Multiplying this matrix by the original pixel vector v = (r, g, b)T, we obtained the hue-modified pixel vector v′ = (r′, g′, b′)T.
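The rotation in (1) can be sketched in pure Python. This is an illustrative check, not the authors' code: a 120° hue rotation about the achromatic axis permutes the R, G, and B components cyclically, and the magnitude of the pixel vector is preserved.

```python
import math

def hue_rotation_matrix(theta_deg):
    """Rotation by theta about the achromatic axis (1,1,1)/sqrt(3), as in (1)."""
    t = math.radians(theta_deg)
    a = math.cos(t)                  # alpha
    b = (1.0 - math.cos(t)) / 3.0    # beta
    g = math.sin(t) / math.sqrt(3.0) # gamma
    return [[a + b, b - g, b + g],
            [b + g, a + b, b - g],
            [b - g, b + g, a + b]]

def rotate(R, v):
    """Matrix-vector product R v for a 3x3 matrix."""
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

# A 120-degree hue rotation maps pure red to pure green.
v = rotate(hue_rotation_matrix(120), [1.0, 0.0, 0.0])
assert all(abs(a - b) < 1e-9 for a, b in zip(v, [0.0, 1.0, 0.0]))

# The rotation preserves the magnitude of an arbitrary pixel vector.
p = [0.2, 0.5, 0.7]
q = rotate(hue_rotation_matrix(37), p)
assert abs(math.sqrt(sum(x * x for x in p)) - math.sqrt(sum(x * x for x in q))) < 1e-9
```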

In order to analyze the effect of hue modification on the separated PRNU, we analyzed this modification in terms of hue rotations in RGB color space. First, we defined three pixel vectors vr = [pr, p̃g, p̃b]T, vg = [p̃r, pg, p̃b]T, and vb = [p̃r, p̃g, pb]T in RGB color space, which represent pixels from the red, green, and blue positions, respectively, of the CFA. Here, pc represents the pixel value measured directly from the image sensor, and p̃c represents the pixel value generated by the demosaicing process. To retain only the meaningful PRNU components, we removed the interpolated values p̃c, which carry a propagation error from demosaicing [19], and obtained the normalized PRNU noise pixels ηr = [1, 0, 0]T, ηg = [0, 1, 0]T, and ηb = [0, 0, 1]T. Then, we multiplied each of the noise pixels ηr, ηg, and ηb by the rotation matrix R(θ) to simulate hue modification, and we obtained the hue-modified pixel vectors

ηr(θ) = (α+β, β+γ, β−γ)T,  ηg(θ) = (β−γ, α+β, β+γ)T,  ηb(θ) = (β+γ, β−γ, α+β)T   (2)

where θ is the degree of hue modification. Using these pixel vectors, we generated a 1×3 synthetic noise image Ic = [η̄r, η̄g, η̄b] and a hue-modified synthetic noise image I′c(θ) = [η̄r(θ), η̄g(θ), η̄b(θ)]. Then, we calculated the correlation ρ′c(θ) as

ρ′c(θ) = corr(Ic, I′c(θ)).   (3)

Fig. 2. Calculated correlation values for all hue rotations, (a) measured from the proposed model (Σc ρ′c(θ)) and (b) measured from the real image (Σc ρc(θ)). Both graphs have a similar shape.

Fig. 2 shows the calculated correlation values for all hue rotations from a real data set (Np = 30, Nikon D90) and for the proposed model. Here, we observed structural similarity between the proposed model Σc ρ′c(θ) and Σc ρc(θ) from the real data set.
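The model curve of (2)–(3) can be reproduced with a short sketch. This is a toy illustration, not the authors' code: Pearson correlation stands in for corr(·, ·), and the three normalized noise pixels are the unit basis vectors, so channel c of the synthetic 1×3 image simply collects the c-th components before and after rotation. The summed score is bell-shaped and peaks at θ = 0, matching the shape in Fig. 2(a).

```python
import math

def rot(theta_deg, v):
    """Apply the hue rotation R(theta) from (1) to a 3-vector."""
    t = math.radians(theta_deg)
    a, b = math.cos(t), (1.0 - math.cos(t)) / 3.0
    g = math.sin(t) / math.sqrt(3.0)
    R = [[a + b, b - g, b + g],
         [b + g, a + b, b - g],
         [b - g, b + g, a + b]]
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

def pearson(x, y):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)) * math.sqrt(sum((b - my) ** 2 for b in y))
    return num / den

def model_score(theta_deg):
    """Sum over channels of corr between the synthetic noise image and its
    hue-modified version, built from eta_r, eta_g, eta_b as in (2)-(3)."""
    basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    rotated = [rot(theta_deg, e) for e in basis]
    return sum(pearson([e[c] for e in basis], [e[c] for e in rotated])
               for c in range(3))

scores = {t: model_score(t) for t in range(0, 360, 5)}
best = max(scores, key=scores.get)
assert best == 0  # the summed correlation peaks at theta = 0 (no modification)
```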

2) Designing a Hue-Modification Estimation Algorithm: From the previously described analysis, we observed that the correlation graphs obtained from the real data set and those produced by the proposed model are bell-shaped and symmetric about the maximum point. The slopes around the maximum point converge toward it. Based on these features, we designed an algorithm to estimate hue modification without checking all hue degrees. The proposed algorithm consists of two steps.

Step 1: The first step reduces computation time by removing unnecessary search intervals. It is based on a small number of samples and is similar to the naïve scheme, but much faster, since the search range is divided at the regular interval Δi.

First, we selected ns = ⌈360/Δi⌉ samples at regular intervals of Δi in the range of hue values (0, 360], and we created a set of selected hue values {θ1, θ2, ..., θns}. Then, the hue of the suspicious image Isus was modified with each of these values, and we obtained a hue-modified image set {Isus(θ1), Isus(θ2), ..., Isus(θns)}. Applying a denoising filter F to each image in this set, we obtained a set {nc(θ1), nc(θ2), ..., nc(θns)} of noise residuals.

After that, we calculated the cross correlation between the noise residuals and the reference patterns Pc from the source camera, and we chose the value θ′ with the largest correlation as an intermediate estimate of the modified hue degree

θ′ = argmax_θ ( Σc corr(nc(θ), Pc) ),  where θ ∈ {θ1, ..., θns}.   (4)

Step 2: In this step, we set [Mind, Maxd] for the candidate range, where Mind ← θ′ − Δi/2 and Maxd ← θ′ + Δi/2. If there is a global maximum θ̂ in the range [Mind, Maxd], we can find it using an algorithm based on a hill-climbing search, since the slopes around the maximum point converge toward it.

The algorithm for this step is shown in Algorithm 1. The design of Algorithm 1 comes from the concept of a binary search with hill-climbing optimization. First, we calculated correlation values for the intermediate angles to the left and right of the candidate range. Second, we narrowed the search range in the direction of the greater value and reduced the interval by half. These iterations were repeated until the interval was smaller than ε. The estimated degree θ̂ was calculated from the final candidate range [Mind, Maxd] as

θ̂ ← mod(360 − ⌈(Maxd + Mind)/2⌉, 360).   (5)


Algorithm 1 Global Optimization Algorithm for the Proposed Estimator

Mind ← θ′ − Δi/2
Maxd ← θ′ + Δi/2
itv ← Δi
ε ← smallest unit of hue modification
while itv > ε do
    LeftEstimation ← Σc corr(nc(Mind + itv/4), Pc)
    RightEstimation ← Σc corr(nc(Maxd − itv/4), Pc)
    if LeftEstimation > RightEstimation then
        Maxd ← Maxd − itv/2
    else
        Mind ← Mind + itv/2
    end if
    itv ← itv/2
end while
return θ̂ ← mod(360 − (Maxd + Mind)/2, 360)

The estimated degree θ̂ indicates the degree of the modified hue. For example, if θ̂ is zero, the suspicious image Isus has not undergone any hue manipulation. In contrast, if θ̂ is not zero, the hue of Isus has been modified.
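The two-step search can be sketched as follows. This is not the authors' implementation: the real objective is the summed correlation between the noise residuals nc(θ) and the reference patterns Pc, which is replaced here by a stand-in unimodal function (an assumption for illustration), and the ceiling in (5) is dropped since ε is below one degree.

```python
import math

def estimate_hue(score, delta_i=30.0, eps=0.1):
    """Two-step estimator: coarse scan (Step 1), then the interval-halving
    refinement of Algorithm 1. score(theta) stands in for
    sum_c corr(n_c(theta), P_c)."""
    # Step 1: ns = ceil(360 / delta_i) samples at regular intervals in (0, 360].
    ns = math.ceil(360.0 / delta_i)
    candidates = [delta_i * (k + 1) for k in range(ns)]
    theta1 = max(candidates, key=score)
    # Step 2: Algorithm 1 (binary-search-style hill climbing on the bracket).
    min_d, max_d = theta1 - delta_i / 2.0, theta1 + delta_i / 2.0
    itv = delta_i
    while itv > eps:
        left = score(min_d + itv / 4.0)
        right = score(max_d - itv / 4.0)
        if left > right:
            max_d -= itv / 2.0
        else:
            min_d += itv / 2.0
        itv /= 2.0
    # (5), with the ceiling dropped since eps is sub-degree here.
    return (360.0 - (max_d + min_d) / 2.0) % 360.0

# Stand-in objective: the correlation is maximal when the probe rotation theta
# undoes the unknown modification, i.e. when theta_true + theta = 360.
theta_true = 47.0
score = lambda th: math.cos(math.radians(theta_true + th))
assert abs(estimate_hue(score) - theta_true) < 1.0
```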

C. Partial Manipulation Detection Algorithm

In this section, we describe how we detected partial manipulation of the hue based on the proposed estimator. To detect hue-modified areas, we defined a w×w-pixel sliding window, which moved across the reference pattern P and the suspicious image Isus. Using this window, we obtained the block I(i, j) to be investigated and its corresponding reference pattern P(i, j) via

P(i, j) = P[n][m]   (6)
Isus(i, j) = Isus[n][m]   (7)

where i − (w/2) ≤ n, m ≤ i + (w/2), and [n][m] denotes the image pixel in the nth row and mth column. Using the proposed estimator F(P, I), we obtained an estimation map Mθ of Isus

Mθ(i, j) = F(P(i, j), Isus(i, j)).   (8)
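The sliding-window map of (6)–(8) can be sketched generically. This is an illustrative skeleton, not the paper's code: `estimator` is a placeholder for the PRNU-based F(P, I), the toy estimator below is invented for the demo, and clamping the window at image borders is an assumption (the paper does not specify border handling).

```python
def estimation_map(pattern, image, w, estimator):
    """Build M(i, j) by sliding a w x w window over the reference pattern
    and the suspicious image, as in (6)-(8)."""
    h, wd = len(image), len(image[0])
    m = [[0.0] * wd for _ in range(h)]
    r = w // 2
    for i in range(h):
        for j in range(wd):
            # Clamp the window at the borders (an assumption for this sketch).
            block_p = [row[max(0, j - r):j + r + 1]
                       for row in pattern[max(0, i - r):i + r + 1]]
            block_i = [row[max(0, j - r):j + r + 1]
                       for row in image[max(0, i - r):i + r + 1]]
            m[i][j] = estimator(block_p, block_i)
    return m

def toy_estimator(bp, bi):
    """Mean absolute difference between blocks; stands in for F(P, I)."""
    vals = [abs(a - b) for rp, ri in zip(bp, bi) for a, b in zip(rp, ri)]
    return sum(vals) / len(vals)

P = [[0.0] * 4 for _ in range(4)]
I = [[0.0] * 4 for _ in range(4)]
I[2][2] = 1.0  # a locally "tampered" pixel
M = estimation_map(P, I, w=3, estimator=toy_estimator)
assert M[2][2] > M[0][0]  # the window around the tampered pixel stands out
```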

If the inspected image is partially tampered with, the estimation degree of the tampered region will not have a value of zero.

The estimation result Mθ(i, j) might contain some false-positive errors, since regions in which the PRNU is naturally suppressed cause false positives [6]. Therefore, we needed to check not only the hue degrees but also the correlation values to reduce false positives.

First, we calculate the correlation map Mρ(i, j) using Mθ(i, j)

Mρ(i, j) = Σc corr(ηc(i, j)(Mθ(i, j)), Pc(i, j))   (9)

where c ∈ {r, g, b}, and ηc(i, j)(θ) is the noise residual component of Isus. If pixel (i, j) does not have sufficient noise strength to be detected, the estimation result Mθ(i, j) will not be accurate and should be regarded as a false positive. Therefore, we determined whether pixel (i, j) was modified using

Z(i, j) = { 1, if |θ0 − Mθ(i, j)| ≥ τθ and Mρ(i, j) ≥ τρ
            0, otherwise   (10)

where Z(i, j) indicates whether the hue of each pixel is modified, and τθ and τρ are the difference threshold for the hue-modification degree and the threshold for the correlation, respectively. The operator | · | denotes an absolute value in the range (0, 360], and θ0 indicates the unmodified degree value 0.
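The two-threshold decision of (10) can be sketched as follows. This is a minimal illustration: | · | is interpreted as the circular hue distance so that estimates near 360° count as small deviations, and the sample maps and threshold values are invented for the demo.

```python
def circular_diff(a, b):
    """Absolute hue difference on the circle, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def detect(m_theta, m_rho, tau_theta, tau_rho):
    """Z(i, j) = 1 iff |theta_0 - M_theta| >= tau_theta and M_rho >= tau_rho,
    as in (10); theta_0 = 0 is the unmodified degree."""
    theta0 = 0.0
    return [[1 if circular_diff(theta0, t) >= tau_theta and r >= tau_rho else 0
             for t, r in zip(row_t, row_r)]
            for row_t, row_r in zip(m_theta, m_rho)]

M_theta = [[2.0, 120.0], [330.0, 180.0]]  # estimated hue degrees
M_rho   = [[0.9, 0.8],   [0.9,  0.01]]   # correlation strengths
Z = detect(M_theta, M_rho, tau_theta=15.0, tau_rho=0.1)
# (1,1) is rejected despite its large hue deviation: its PRNU correlation is
# too weak, which is exactly the false-positive case the second threshold filters.
assert Z == [[0, 1], [1, 0]]
```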

TABLE I

DIGITAL CAMERA MODELS IN OUR EXPERIMENTS AND ESTIMATION RESULTS FOR EACH CAMERA MODEL

TABLE II

SOURCE IMAGE DESCRIPTION FOR EACH TYPE OF REFERENCE PATTERN

In (10), the value 1 could be replaced with the estimation value Mθ(i, j). Specific implementation details for this algorithm are described in Section III-C.

III. EXPERIMENTAL RESULTS

For the experiment, we used sample raw images collected from the Dresden Image Database [21] for digital image forensics. The rest of the raw images were taken directly using the camera models listed in Table I, which also gives the details of the image database. All raw images were interpolated by dcraw, the most widely used raw-image decoder, using the adaptive homogeneity-directed interpolation algorithm [22].

For evaluating accuracy, the root mean square error (RMSE) was calculated as [E((θ̂ − θ)²)]^(1/2), where θ and θ̂ represent the actual and estimated degrees of hue modification, respectively. The RMSE serves to aggregate the magnitudes of the estimation errors across the various experiments. Generally, RMSE values around 3.0 indicate that the estimations are highly accurate (>99.2%).
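The error measure above can be sketched directly; this follows the plain (non-circular) formula stated in the text, with made-up sample values for the check.

```python
import math

def rmse(estimated, actual):
    """Root mean square error, [E((theta_hat - theta)^2)]^(1/2)."""
    errs = [(e - a) ** 2 for e, a in zip(estimated, actual)]
    return math.sqrt(sum(errs) / len(errs))

# Perfect estimates give zero error; a constant 3-degree miss gives RMSE = 3.0,
# the level the text describes as highly accurate over the 360-degree range.
assert rmse([30.0, 60.0], [30.0, 60.0]) == 0.0
assert abs(rmse([33.0, 57.0], [30.0, 60.0]) - 3.0) < 1e-9
```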

A. Hue-Modification Estimation Result

1) Reference Pattern: To evaluate our method, we used various kinds of source images to generate reference patterns. Table II lists the source image description for each reference pattern. We used not only flat images (such as blue-sky images) but also general images taken in a wide variety of indoor and outdoor scenes to evaluate our method in various scenarios.

Fig. 3(a) plots the RMSE against the number of images Np for each reference pattern. Experiments with RP1 showed the best results, even when Np < 5. The results for RP2 were slightly worse than those for RP1, but RP2 still showed good performance. RP3 performed the worst, since JPEG compression damaged the high-frequency component of the target image (which included the PRNU component).

2) Evaluation of the Fast Algorithm: To evaluate the fast algorithm described in Section II-B, the computation time and accuracy were measured at various values of Δi. Tests were conducted using 200 test images taken with the Nikon D90 and used with RP1 (Np = 30). A computer with an Intel i7-3770 (3.40 GHz) CPU and 16 GB of main memory was used to measure the performance. The hue of the sample images was shifted randomly to generate test images.

Fig. 3(b) shows the computation time and accuracy of the fast algorithm. Note that the result obtained with Δi = 1 is the same as the result obtained with the naïve scheme [14]. When Δi was increased, the computation time decreased dramatically but estimation accuracy was lost. Even so, RMSE values around 3.0 are still sufficient to detect hue modification.

Fig. 3. (a) Estimated RMSE of the degree for each reference pattern and the number of images Np for each reference pattern. (b) RMSE and computation time results for the fast algorithm.

To analyze the computational complexity, we define ε as the unit size of hue modification. We can describe the size of the hue-changeable range as n = 360/ε. Then, the time complexity of the naïve scheme is T · O(n), where the symbol T denotes the total processing complexity of the noise extraction and correlation calculation for the suspicious image Isus. In contrast, the time complexity of the fast algorithm is T · O(⌈360/Δi⌉ + log(Δi)), which is much lower than that of the naïve scheme.
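The gap between the two complexities can be made concrete by counting estimator evaluations, each of which costs T. This is a rough sketch under the assumption of two correlation probes per interval halving, as in Algorithm 1; the exact constants are not stated in the text.

```python
import math

def naive_evals(eps=1.0):
    """The naive scheme probes every hue step: n = 360 / eps evaluations."""
    return int(360 / eps)

def fast_evals(delta_i=30.0, eps=1.0):
    """Step 1 probes ceil(360 / delta_i) coarse samples; Step 2 probes twice
    per halving of delta_i down to eps (an assumed constant of 2)."""
    halvings = math.ceil(math.log2(delta_i / eps))
    return math.ceil(360 / delta_i) + 2 * halvings

assert naive_evals() == 360
assert fast_evals() == 22  # 12 coarse samples + 2 probes x 5 halvings
```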

3) Camera Model: In order to examine the influence of the types of image sensors on the proposed method, we conducted tests with the camera models described in Table I. Table I gives the details of the estimation results: the estimated mean and RMSE. According to the results, the estimation accuracy was good with the D200, D70, D90, and E420. The estimation result for the D70s was somewhat lower and its mean was slightly biased, but it still showed high accuracy (around 99%).

4) Hue Degree: To investigate the influence of the modified hue degree, the RMSE values were tested for each hue degree. For estimation, sample images from the Nikon D90 and reference pattern RP1 (Np = 30) were used. The hue of the sample images was shifted from 0° to 330° in steps of 30° to generate test images. As can be seen in Table III, the mean of the estimated degree is similar to the degree of the actual modification. Furthermore, the standard deviation and RMSE do not depend on the degree of hue modification.

5) Block Size: In this experiment, the central regions of the sample images and reference patterns were cropped to various block sizes. Using the cropped block as an input image, we evaluated the performance of the proposed method. Table IV shows the RMSE for each block size. The RMSE values increased when the block size decreased, because small blocks included less PRNU than bigger blocks did. This result is related to the performance of the partial-manipulation detector, because the w×w block is directly associated with the w×w-pixel sliding window. As the window size increased, the accuracy of hue estimation also increased, but the resolution for localizing the forged region became coarser.

TABLE III
ESTIMATED MEAN, STANDARD DEVIATION, AND RMSE FOR EACH HUE INTERVAL

TABLE IV
RMSE FOR IMAGE SIZE

B. Comparison Results for Various Attacks

The estimation results of the proposed method were compared with those of Choi's method [2] with an interval factor of Δs = 1. The proposed method was performed with reference pattern type RP2, where Np = 3 and 30.

1) Image Resizing: Fig. 4(a) shows the RMSE values for different image-scaling factors. The reference patterns were preprocessed to the same sizes as the suspicious images before applying our proposed method, and the input images for Choi's method were restored to the original size for a fair comparison. The results of both methods are qualitatively similar for a scaling ratio of 1.0. However, Choi's method did not work at any scaling ratio other than the original size. In contrast, the results of the proposed method are acceptable at scaling ratios of 0.5 and higher.

2) JPEG Compression: We compressed the test images by varying the JPEG quality factor from 10 to 100 and tested the proposed method for estimating hue modification. Fig. 4(b) shows the RMSE values for different JPEG compression qualities. The performance of the proposed method for images with a JPEG quality factor of 95 or higher was as good as the result for uncompressed images. The results for quality factors between 80 and 95 were also acceptable. On the other hand, even in tests with a high JPEG quality factor (e.g., 98 and 97), Choi's method [2] performed relatively poorly with respect to the estimation of hue modification. For quality factors less than 95, Choi's method did not work.

An interesting behavior can be observed in Fig. 4(b) as the JPEG quality factor decreases toward 30: the proposed method's output reverses relative to the overall trend, increasing for Np = 30 and decreasing for Np = 3. At JPEG quality factors less than 50, the estimated degrees for some images (especially dark and highly textured ones) were completely random, since the PRNU components were incompletely extracted from those images. Therefore, we conclude that those outliers gave the results this odd form.

C. Detection of Partial Manipulation

In this section, we tested the algorithm for detecting partialmanipulation. Using Adobe Photoshop CS6, we modified the hue

Page 5: Letter - KAISThklee.kaist.ac.kr/publications/IEEE Trans. on... · Letter Detection of Hue Modification Using Photo Response Nonuniformity Jong-Uk Hou, Student Member, IEEE, and Heung-Kyu

1830 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, VOL. 27, NO. 8, AUGUST 2017

Fig. 5. Sample images for the partial manipulation detection experiment and its estimation map Mθ (i, j) and detection map Z(i, j). To show the estimatedhue degree, degrees are mapped to the hue of an HSV image where saturation is 1 and brightness is 0.5. In the detection map Z(i, j), forged regions aremapped to the estimated hue value and unmodified regions are represented as black pixels. (a) Image I.1. (b) Image I.2. (c) Mθ (i; j) for I.1. (d) Mθ (i; j)for I.2. (e) Z (i; j) for I.1. (f) Z (i ; j) for I.2. (g) Image II.1. (h) Image II.2. (i) Mθ (i; j) for II.1. (j) Mθ (i; j) for II.2. (k) Z (i; j) for II.1. (l) Z (i; j) for II.2.(m) Image III.1. (n) Image III.2. (o) Mθ (i ; j) for III.1. (p) Mθ (i; j) for III.2. (q) Z (i; j) for III.1. (r) Z (i; j) for III.2.

Fig. 6. 3D histogram of the bivariate data [Mθ, Mρ]. (a) Image I.1: obtained from the original image. (b) Image I.2: obtained from a forged image. To visualize the histogram, the range of the estimated hue was changed to [−180, 180].

of three sample images taken with the Nikon D90. As shown in Fig. 5, the hue of the target objects, such as the smartphone, was manipulated to distort the meaning of the images.

1) Reducing False-Positive Errors: To reduce the detection errors described in Section II-C, we modeled the distribution of our estimation results using the 2D Gaussian model

f(x, y) = A exp(−((x − μx)² / (2σx²) + (y − μy)² / (2σy²)))    (11)

where the x-axis and y-axis indicate the estimated hue degree and the correlation value, respectively. Here, μx and σx denote the mean and standard deviation of x, and the coefficient A is the amplitude of the distribution. The distribution of correlation values follows the generalized Gaussian distribution, as discussed elsewhere [6]. We experimentally concluded that the distribution of the estimated hue degrees also follows the Gaussian distribution.

Fig. 6 shows the 3D histogram of the bivariate data [Mθ, Mρ] obtained from Images I.1 and I.2. To visualize the histogram, the range of the

estimated hue was changed to [−180, 180]. We intentionally selected a small Np (= 5, RP1) to show a bar of false positives in the histogram. We can observe a small number of false positives with Mθ(i, j) > τθ in the red circled areas in Fig. 6(a). For the reader's reference, false-positive regions can also be observed at the top of Fig. 5(o).

Generally, these kinds of false positives occurred due to inaccurately extracted PRNU, as discussed in Section II-C. Thus, we adopted the threshold τθ for the estimated hue and the correlation threshold τρ, defined as

τθ = μx − tx σx
τρ = μy − ty σy    (12)

where tx and ty are predetermined values used to adjust the probability of a false-positive error. Thresholds τθ and −τθ are illustrated as red dashed lines, and τρ is illustrated as a red solid line in Fig. 6(b). Therefore, we obtained the reducible false-positive
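The thresholding in (12) can be sketched as follows. This is only an illustrative sketch: the function names, the sample data, and the decision rule (keeping a window as a detection only when its estimated hue magnitude exceeds τθ and its correlation exceeds τρ) are one plausible reading of the scheme, not the authors' exact implementation.

```python
import statistics

def fit_thresholds(hue_degrees, correlations, t_x=1.41, t_y=1.41):
    """Fit Gaussian marginals to the estimated hue degrees (x) and the
    correlation values (y), then derive the thresholds of Eq. (12):
    tau_theta = mu_x - t_x * sigma_x and tau_rho = mu_y - t_y * sigma_y."""
    mu_x, sigma_x = statistics.mean(hue_degrees), statistics.pstdev(hue_degrees)
    mu_y, sigma_y = statistics.mean(correlations), statistics.pstdev(correlations)
    return mu_x - t_x * sigma_x, mu_y - t_y * sigma_y

def keep_detection(theta, rho, tau_theta, tau_rho):
    """One plausible decision rule (an assumption): a window is kept as a
    detection only if its estimated hue lies outside [-tau_theta, tau_theta]
    and its correlation exceeds tau_rho."""
    return abs(theta) > tau_theta and rho > tau_rho
```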



Fig. 4. Estimated RMSE of the degree for (a) image resizing and (b) JPEG compression quality (*: uncompressed).

rate P with τθ and τρ as

P = ∫_{−∞}^{τρ} (∫_{−∞}^{∞} f(x, y) dx − ∫_{−τθ}^{τθ} f(x, y) dx) dy.    (13)

In this experiment, we used tx = ty = 1.41 to obtain P = 10^−3. We note that an inaccurate PRNU causes false negatives with other methods [5]–[7], [9], as opposed to false positives with our method.
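Equation (13) can be evaluated in closed form when f is assumed to be a *normalized*, separable Gaussian (i.e., A = 1/(2π σx σy), an assumption of this sketch): the double integral then factors into products of 1D normal CDFs. The value P = 10^−3 quoted above depends on the fitted μ and σ, so the sketch below does not try to reproduce it; the function name and parameter values are illustrative.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def false_positive_rate(mu_x, sigma_x, mu_y, sigma_y, t_x, t_y):
    """Evaluate Eq. (13) for a normalized, separable 2D Gaussian f.
    The y-integral up to tau_rho and the x-mass outside [-tau_theta,
    tau_theta] each reduce to differences of normal CDFs."""
    tau_theta = mu_x - t_x * sigma_x  # Eq. (12)
    tau_rho = mu_y - t_y * sigma_y
    # Mass of y below tau_rho; note (tau_rho - mu_y)/sigma_y == -t_y.
    p_y = phi((tau_rho - mu_y) / sigma_y)
    # Mass of x inside [-tau_theta, tau_theta].
    p_x_inside = phi((tau_theta - mu_x) / sigma_x) - phi((-tau_theta - mu_x) / sigma_x)
    return p_y * (1.0 - p_x_inside)
```

Because (τρ − μy)/σy = −ty by construction, the y factor equals Φ(−ty) regardless of the fitted correlation statistics; only the x factor depends on μx/σx.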

2) Partial Manipulation Detection Results: For these experiments, RP1 (Np = 30) reference patterns were used. We selected a 256 × 256 sliding window for its relatively good estimation performance.

Fig. 5 shows the estimated hue angle Mθ(i, j) and the detection result Z(i, j) for the partially manipulated images. To show the estimated hue angle, the estimated degree is mapped to the hue of an HSV image, where saturation is 1 and brightness is 0.5. In the detection map Z(i, j), manipulated regions are mapped to the estimated hue value and unmodified regions are represented as black pixels. We removed from Z all connected tampered regions containing fewer than 128 × 128 pixels. Using the estimated hue values in the detection map, we could restore the hue-modified image to an image with colors similar to the original. For example, we can determine that the color of the car in Fig. 5(n) is blue by using the estimated hue illustrated in Fig. 5(r).
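The removal of small connected regions from Z can be sketched as a flood-fill pass over the detection map. The function name and the choice of 4-connectivity are assumptions for illustration; the paper does not specify the connectivity used.

```python
from collections import deque

def remove_small_regions(detection, min_pixels):
    """Zero out connected components of nonzero pixels in `detection`
    (a 2D list) that contain fewer than `min_pixels` pixels.
    Uses 4-connectivity BFS flood fill."""
    rows, cols = len(detection), len(detection[0])
    seen = [[False] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if detection[r][c] and not seen[r][c]:
                # Collect one connected component with BFS.
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and detection[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Discard components smaller than the minimum size.
                if len(comp) < min_pixels:
                    for y, x in comp:
                        detection[y][x] = 0
    return detection
```

In the setting above, `min_pixels` would be 128 × 128 = 16384 for a full-size detection map.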

The effectiveness of the partial manipulation detection algorithm is guaranteed by the assumption that the image undergoes hue modification and that the PRNU noise is not distorted by other forms

Fig. 7. ROC curves of the proposed method, Choi's method [2], and Chen's method [6]. Results are compared under various attacks. (a) No compression. (b) JPEG 95. (c) Scaling 0.9. (d) Scaling 0.5/JPEG 95.

of image manipulation. Theoretically, any invariant image feature could be used to detect the relevant image modifications; in practice, however, this is infeasible, because invariant features such as PRNU noise may be distorted by other unknown manipulations. Even if we know the type and order of the image modifications, the complexity of detection increases dramatically. Therefore, this issue should be resolved in combination with other PRNU-based methods such as [6] and [8]. The PRNU-distorted regions can be detected by other PRNU-based methods, since we already have a reference pattern for our detector. On the other hand, our method regards these regions as false positives with reference to the threshold τρ and (13).

To enhance the performance of the local manipulation detector, we could adopt the following methods to improve the quality of the PRNU noise in our scheme. First, Li [23] proposed an approach to attenuating the influence of scene details on PRNUs. In addition, Li and Satta [9] studied the potential correlation between the quality of PRNU noise and the vignetting effect. Lin and Li [24] proposed a preprocessing approach to attenuating the influence of nonunique artifacts on the reference pattern to reduce the false-identification rate. Considering these aspects of PRNU noise, we could improve our partial manipulation detector.

3) Comparison Results for Various Attacks: The results of the proposed method were compared with those of Choi et al. [2] (Δs = 1, 32 × 32 block size) and Chen et al. [6] (window size = 256 × 256). Reference pattern RP1 (Np = 30) was used for the proposed method and for Chen et al. [6]. For the method of Chen et al. [6], the estimated reference pattern and noise residual extracted from each color channel were combined into a grayscale image using the linear combination described earlier.
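The per-channel combination step can be sketched as below. The exact coefficients are "described earlier" in the paper and are not reproduced here; the standard ITU-R BT.601 luminance weights in this sketch are only a placeholder assumption, as are the function and parameter names.

```python
def combine_channels(prnu_r, prnu_g, prnu_b, weights=(0.299, 0.587, 0.114)):
    """Linearly combine per-channel PRNU signals (2D lists of equal size)
    into a single grayscale pattern. The default weights are the standard
    BT.601 luminance coefficients, used here only as a stand-in for the
    combination the paper refers to."""
    w_r, w_g, w_b = weights
    return [
        [w_r * r + w_g * g + w_b * b
         for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(prnu_r, prnu_g, prnu_b)
    ]
```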

Fig. 7 reports the receiver operating characteristic (ROC) curves of the proposed method, Choi's method [2], and Chen's method [6]. We used the test images shown in Fig. 5 with JPEG compression and scaling. Choi's method performed better than the proposed method on uncompressed images [Fig. 7(a)]. However, it lost accuracy after



JPEG compression, and it completely malfunctioned in all cases with resized images, even when the scaling ratio was close to 1.0. Chen's method did not detect hue modification in any case.

IV. CONCLUSION

In this paper, we discussed a family of PRNU-based image manipulation detectors that can localize hue-modified regions and estimate the degree of modification even after arbitrary common image processing. Using the separated PRNU of an image, we designed a forensic scheme for estimating the degree of hue modification. We also proposed an efficient estimation scheme and an algorithm for the detection of partial manipulation. This method achieved hue forgery detection robust to the effects of common image processing, which was not achieved by previous forgery detectors [2], [6]. We plan to extend the method to the estimation of other types of image property modification, such as white balancing and saturation.

REFERENCES

[1] A. C. Popescu and H. Farid, "Exposing digital forgeries by detecting traces of resampling," IEEE Trans. Signal Process., vol. 53, no. 2, pp. 758–767, Feb. 2005.

[2] C.-H. Choi, H.-Y. Lee, and H.-K. Lee, "Estimation of color modification in digital images by CFA pattern change," Forensic Sci. Int., vol. 226, pp. 94–105, Mar. 2013.

[3] H. Farid, "Exposing digital forgeries from JPEG ghosts," IEEE Trans. Inf. Forensics Security, vol. 4, no. 1, pp. 154–160, Mar. 2009.

[4] M. K. Johnson and H. Farid, "Exposing digital forgeries in complex lighting environments," IEEE Trans. Inf. Forensics Security, vol. 2, no. 3, pp. 450–461, Sep. 2007.

[5] J. Lukáš, J. Fridrich, and M. Goljan, "Detecting digital image forgeries using sensor pattern noise," Proc. SPIE, vol. 6072, p. 60720Y, Feb. 2006.

[6] M. Chen, J. Fridrich, M. Goljan, and J. Lukáš, "Determining image origin and integrity using sensor noise," IEEE Trans. Inf. Forensics Security, vol. 3, no. 1, pp. 74–90, Mar. 2008.

[7] G. Chierchia, G. Poggi, C. Sansone, and L. Verdoliva, "A Bayesian-MRF approach for PRNU-based image forgery detection," IEEE Trans. Inf. Forensics Security, vol. 9, no. 4, pp. 554–567, Apr. 2014.

[8] G. Chierchia, S. Parrilli, G. Poggi, C. Sansone, and L. Verdoliva, "On the influence of denoising in PRNU based forgery detection," in Proc. 2nd ACM Workshop Multimedia Forensics, Secur. Intell. (MiFor), New York, NY, USA, 2010, pp. 117–122.

[9] C.-T. Li and R. Satta, "On the location-dependent quality of the sensor pattern noise and its implication in multimedia forensics," in Proc. 4th Int. Conf. Imag. Crime Detect. Prevention (ICDP), Nov. 2011, pp. 1–6.

[10] X. Pan and S. Lyu, "Region duplication detection using image feature matching," IEEE Trans. Inf. Forensics Security, vol. 5, no. 4, pp. 857–867, Dec. 2010.

[11] V. Christlein, C. Riess, J. Jordan, C. Riess, and E. Angelopoulou, "An evaluation of popular copy-move forgery detection approaches," IEEE Trans. Inf. Forensics Security, vol. 7, no. 6, pp. 1841–1854, Dec. 2012.

[12] S.-J. Ryu, M. Kirchner, M.-J. Lee, and H.-K. Lee, "Rotation invariant localization of duplicated image regions based on Zernike moments," IEEE Trans. Inf. Forensics Security, vol. 8, no. 8, pp. 1355–1370, Aug. 2013.

[13] D. Cozzolino, G. Poggi, and L. Verdoliva, "Efficient dense-field copy–move forgery detection," IEEE Trans. Inf. Forensics Security, vol. 10, no. 11, pp. 2284–2297, Nov. 2015.

[14] J.-U. Hou, H.-U. Jang, and H.-K. Lee, "Hue modification estimation using sensor pattern noise," in Proc. IEEE Int. Conf. Image Process. (ICIP), Oct. 2014, pp. 5287–5291.

[15] C.-H. Choi, J.-H. Choi, and H.-K. Lee, "CFA pattern identification of digital cameras using intermediate value counting," in Proc. 13th ACM Multimedia Workshop Multimedia Secur. (MM&Sec), New York, NY, USA, 2011, pp. 21–26.

[16] B. E. Bayer, "Color imaging array," U.S. Patent 3 971 065, Jul. 20, 1976.

[17] J. Nakamura, Image Sensors and Signal Processing for Digital Still Cameras. Boca Raton, FL, USA: CRC, 2005.

[18] J. Lukáš, J. Fridrich, and M. Goljan, "Digital camera identification from sensor pattern noise," IEEE Trans. Inf. Forensics Security, vol. 1, no. 2, pp. 205–214, Jun. 2006.

[19] C.-T. Li and Y. Li, "Color-decoupled photo response non-uniformity for digital image forensics," IEEE Trans. Circuits Syst. Video Technol., vol. 22, no. 2, pp. 260–271, Feb. 2012.

[20] C. J. Taylor and D. J. Kriegman, "Minimization on the Lie group SO(3) and related manifolds," Dept. Elect. Eng., Yale Univ., New Haven, CT, USA, Tech. Rep. 9405, 1994.

[21] T. Gloe, A. Winkler, and K. Borowka, "Efficient estimation and large-scale evaluation of lateral chromatic aberration for digital image forensics," Proc. SPIE, vol. 7541, p. 754107, Jan. 2010.

[22] C.-K. Lin. (Apr. 2003). Pixel Grouping for Color Filter Array Demosaicing. [Online]. Available: http://sites.google.com/site/chklin/demosaic

[23] C.-T. Li, "Source camera identification using enhanced sensor pattern noise," IEEE Trans. Inf. Forensics Security, vol. 5, no. 2, pp. 280–287, Jun. 2010.

[24] X. Lin and C.-T. Li, "Preprocessing reference sensor pattern noise via spectrum equalization," IEEE Trans. Inf. Forensics Security, vol. 11, no. 1, pp. 126–140, Jan. 2016.