Page 1: Sensors Fingerprinting


Sensor Fingerprinting: Challenges and Trends

Walter J. Scheirer, [email protected]

VAST Lab, University of Colorado at Colorado Springs

Page 2: Sensors Fingerprinting


What is Sensor Fingerprinting?

• Identifying the data acquisition device that generated a given image
  – Device Class Identification
  – Specific Device Identification

Page 3: Sensors Fingerprinting


Why would we want to do this?

• Digital Image Forensics
  “Over 90% of our computer forensics cases involve child pornography”
  - unnamed detective from the Pennsylvania State Police

• The burden of digital photography
  – Is an image real, or has it been rendered by a computer?
  – Is an image an original, or has it been altered by digital image processing?


Page 4: Sensors Fingerprinting


Ashcroft v. Free Speech Coalition

• Supreme Court ruling:
  – Struck down two provisions of the Child Pornography Prevention Act of 1996
  – Synthetic images depicting what would be considered illegal in a genuine image are legal, and protected by the First Amendment

• How does a forensic investigator determine what is real, and what is synthetic?

*http://www.law.cornell.edu/supct/html/00-795.ZO.html

Page 5: Sensors Fingerprinting


Digital Image Forensics

A three-stage process:
1. Image Source Identification (Sensor Fingerprinting)
2. Discrimination of Synthetic Images from Real Images
3. Image Forgery Detection

Page 6: Sensors Fingerprinting


Digital Image Forensics

• If an image is indeed real…
  – What can we determine about the nature of the source?
  – Can we pinpoint the exact model of the device?
  – Can we prove beyond a reasonable doubt that a certain device produced the image?

Evidence!

Page 7: Sensors Fingerprinting


Vision of the Unseen

• Sensor fingerprinting relies on underlying characteristics of the sensors and image processing techniques used by digital cameras
  – Artifacts, distortions, statistical properties

• These characteristics are nearly always imperceptible to the human eye


Page 8: Sensors Fingerprinting


Image Acquisition Pipeline

Lens → Filter(s) → Color Filter Array → Sensor → Camera Processing

Page 9: Sensors Fingerprinting


Sensors

• CCD & CMOS
• Tremendous variation between individual sensors
• Defects caused by manufacturing or use

Page 10: Sensors Fingerprinting


Color Filter Arrays

• RGB: Bayer Pattern
• CMYK

Page 11: Sensors Fingerprinting


Sensor Fingerprinting: Source Model Identification

• Digital Cameras
  – Lens
  – Size of sensor
  – Choice of CFA
  – Demosaicing algorithm
  – Color processing algorithm

• Many manufacturers use the same components

Page 12: Sensors Fingerprinting


Image Features

• Revisiting some of the techniques of steganalysis: Kharrazi et al. 2004*
  – Defines a set of 34 features inspired by universal steganalysis techniques
  – Color features, wavelet coefficient statistics, image quality metrics

*M. Kharrazi, H. T. Sencar, and N. Memon, Blind Source Camera Identification, Proc. of IEEE ICIP (2004)

Page 13: Sensors Fingerprinting


Image Color Features

• Average pixel value
  – Gray world assumption: average values in the RGB channels of an image should average to gray
• RGB pairs correlation
  – Variance in correlation of RG, RB, and GB pairs across sensor manufacturers
• Neighbor distribution center of mass
  – Calculate the number of pixel neighbors for each pixel value
  – Distribution gives an indication of the sensitivity of the camera to different intensity levels
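A minimal sketch of how such color features might be computed for an 8-bit RGB image; the simple equal-neighbor count used for the center-of-mass feature is an assumption for illustration, not the exact definition from Kharrazi et al.:

```python
import numpy as np

def color_features(img):
    """img: HxWx3 uint8 RGB array. Returns a small feature vector."""
    r, g, b = (img[..., c].astype(np.float64) for c in range(3))

    # Average pixel value per channel (gray world assumption)
    means = [r.mean(), g.mean(), b.mean()]

    # Correlation between RGB channel pairs
    def corr(a, c):
        return np.corrcoef(a.ravel(), c.ravel())[0, 1]
    pair_corrs = [corr(r, g), corr(r, b), corr(g, b)]

    # Neighbor distribution center of mass (simplified): for each intensity,
    # count how often horizontally adjacent pixels share that value, then
    # take the center of mass of the resulting distribution
    gray = img.mean(axis=2).astype(np.uint8)
    same = gray[:, :-1] == gray[:, 1:]
    hist = np.bincount(gray[:, :-1][same], minlength=256)
    com = (np.arange(256) * hist).sum() / max(hist.sum(), 1)

    return np.array(means + pair_corrs + [com])
```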

Page 14: Sensors Fingerprinting


Image Color Features

• RGB Pairs Energy Ratio

  E1 = |G|² / |B|²,   E2 = |G|² / |R|²,   E3 = |B|² / |R|²

• Wavelet Domain Statistics
  – Decompose each color band using separable quadrature mirror filters*
  – Calculate the mean for each sub-band

*H. Farid and S. Lyu, “Detecting Hidden Messages Using Higher-Order Statistics and Support Vector Machines,” 5th International Workshop on Information Hiding (2002).
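A sketch of the energy ratios and wavelet sub-band statistics. It assumes PyWavelets is available, reads |·|² as the summed squared channel values, and uses Daubechies wavelets as a stand-in for the quadrature mirror filters of Farid and Lyu:

```python
import numpy as np
import pywt

def energy_ratios(img):
    """E1 = |G|²/|B|², E2 = |G|²/|R|², E3 = |B|²/|R|² for an RGB image."""
    r, g, b = (img[..., c].astype(np.float64) for c in range(3))
    er, eg, eb = (r ** 2).sum(), (g ** 2).sum(), (b ** 2).sum()
    return eg / eb, eg / er, eb / er

def wavelet_subband_means(img, wavelet="db4", level=3):
    """Mean absolute value of each detail sub-band, per color channel."""
    feats = []
    for c in range(3):
        coeffs = pywt.wavedec2(img[..., c].astype(np.float64), wavelet, level=level)
        for detail_bands in coeffs[1:]:            # skip the approximation band
            feats.extend(np.mean(np.abs(d)) for d in detail_bands)
    return np.array(feats)
```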

Page 15: Sensors Fingerprinting


Image Quality Metrics

• Pixel difference based measures
  – Mean square error, mean absolute error, modified infinity norm
• Correlation based measures
  – Normalized cross correlation, Czekanowski correlation
• Spectral distance based measures
  – Spectral phase and magnitude errors

*I. Avcibas, N. Memon, and B. Sankur, “Steganalysis Using Image Quality Metrics,” IEEE Transactions on Image Processing, January 2003.
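A minimal sketch of a few of these quality metrics, computed between an image and a filtered version of itself; the Gaussian blur is an assumed stand-in for the filtering step used when such metrics are turned into features:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def iqm_features(img):
    """img: HxWx3 array. Compare the image against a smoothed reference."""
    x = img.astype(np.float64)
    y = gaussian_filter(x, sigma=(1, 1, 0))              # filtered version

    mse = np.mean((x - y) ** 2)                          # mean square error
    mae = np.mean(np.abs(x - y))                         # mean absolute error
    ncc = np.sum(x * y) / np.sum(x ** 2)                 # normalized cross correlation
    czek = np.mean(1 - 2 * np.minimum(x, y).sum(axis=2) /
                   (x.sum(axis=2) + y.sum(axis=2) + 1e-9))  # Czekanowski distance
    return np.array([mse, mae, ncc, czek])
```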

Page 16: Sensors Fingerprinting


Classification

• Construct a feature vector out of the calculated features for a given image
• Build training sets for each class of camera
• Use machine learning for classification of images with unknown sources

[Figure: SVM hyperplane separating feature vectors from Camera 1 and Camera 2]
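A minimal sketch of this classification step with scikit-learn; the feature matrix here is synthetic stand-in data, with one row per image and one camera label per row:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Stand-in data: one 34-dimensional feature vector per image, labelled by camera
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 34))
y = np.repeat(["nikon", "sony"], 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf")           # two-class (or multi-class) SVM
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```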

Page 17: Sensors Fingerprinting


Results of Kharrazi et al.

• Standard two-class SVM

Serious quantization effects observed, driving up classification performance:

            Nikon   Sony
  Nikon     99.88    0.12
  Sony       2.40   97.60

Images recompressed to a JPEG quality level of “75”:

            Nikon   Sony
  Nikon     96.08    3.91
  Sony       9.25   90.74

Page 18: Sensors Fingerprinting


Results of Kharrazi et al.

• Multi-class SVM

                Nikon    Sony   Canon (S110)   Canon (S100)   Canon (S200)
  Nikon         89.67    0.22       4.77           1.64           3.70
  Sony           3.56   95.24       0.31           0.34           0.53
  Canon (S110)   7.85    0.60      78.71           4.78           8.04
  Canon (S100)   3.14    0.32       3.57          92.84           0.11
  Canon (S200)   5.96    2.27       7.88           0.23          83.63

* See also M.J. Tsai and G.H. Wu, Using Image Features to Identify Camera Sources, Proc. of IEEE ICASSP (2006).

Page 19: Sensors Fingerprinting


CFA and Demosaicing Artifacts

• Choice of CFA
• Demosaicing

[Figure: Bayer Pattern (RGB) and CMYK color filter arrays]

Page 20: Sensors Fingerprinting


Image Features from CFA - Cell Phone Cameras

• Çeliktutan et al. 2005*
• Motivation: proprietary interpolation algorithms leave correlations across adjacent bit planes of the images
• Binary Similarity Measures (also used in steganalysis)
• Define a stencil function over a central pixel xc and a neighbor xn in bit plane b:

  α(k, b) = 1 if xc = 0 and xn = 0
            2 if xc = 0 and xn = 1
            3 if xc = 1 and xn = 0
            4 if xc = 1 and xn = 1

*O. Celiktutan, I. Avcibas, B. Sankur and N. Memon, Source Cell-Phone Identification, Proc. of ADCOM (2005).

Page 21: Sensors Fingerprinting


Image Features - Cell Phone Cameras

• Sum over the four neighbors of each pixel, and over all MxN pixels, to obtain the agreement scores α(k, b)

• Normalize the agreement scores:

  p_k^b = α(k, b) / Σ_k α(k, b)

• Define the binary Kullback-Leibler distance (feature 1):

  m1 = - Σ_{n=1}^{4} p_n^7 log( p_n^7 / p_n^8 )

Page 22: Sensors Fingerprinting


Image Features - Cell Phone Cameras

• Define a neighborhood weighting mask (feature 2):

  Score function:  S = Σ_{i=0}^{7} x_i 2^i

  Weighting pattern of the neighbors:
      1    2    4
    128  256    8
     64   32   16

  Absolute histogram difference:  m2 = Σ_{n=0}^{511} | S_n^7 - S_n^8 |
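A rough sketch of these two bit-plane features under the reading above. Which bit planes play the roles of planes 7 and 8, and the simple 4-neighbor scan, are assumptions for illustration rather than the exact definitions of Çeliktutan et al.:

```python
import numpy as np

def bit_plane(gray, k):
    """k-th bit plane of an 8-bit grayscale image (k = 0 is the least significant bit)."""
    return ((gray >> k) & 1).astype(np.int64)

def agreement_scores(plane):
    """Normalized counts of (center, neighbor) patterns 00, 01, 10, 11 over 4-neighborhoods."""
    counts = np.zeros(4, dtype=np.float64)
    for shifted in (np.roll(plane, 1, 0), np.roll(plane, -1, 0),
                    np.roll(plane, 1, 1), np.roll(plane, -1, 1)):
        pattern = 2 * plane + shifted            # 0..3 encodes (xc, xn)
        counts += np.bincount(pattern.ravel(), minlength=4)
    return counts / counts.sum()

def score_histogram(plane):
    """Histogram of neighborhood weighting scores (center weighted 256, ring 1..128)."""
    weights = [1, 2, 4, 8, 16, 32, 64, 128]
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    s = 256 * plane
    for w, (dy, dx) in zip(weights, shifts):
        s = s + w * np.roll(np.roll(plane, dy, 0), dx, 1)
    return np.bincount(s.ravel(), minlength=512)

def binary_similarity_features(gray):
    p7 = agreement_scores(bit_plane(gray, 1))    # stand-in for the "7th" bit plane
    p8 = agreement_scores(bit_plane(gray, 0))    # stand-in for the "8th" bit plane
    eps = 1e-12
    m1 = -np.sum(p7 * np.log((p7 + eps) / (p8 + eps)))            # binary KL distance
    m2 = np.abs(score_histogram(bit_plane(gray, 1)) -
                score_histogram(bit_plane(gray, 0))).sum()         # absolute histogram difference
    return m1, m2
```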

Page 23: Sensors Fingerprinting


Image Features - Cell Phone Cameras

• Czekanowski distance - feature 3
• Classification
  – KNN and multi-class SVM

[Figure: Scatter plot of three features for three different cameras, from Çeliktutan et al. 2005]

Page 24: Sensors Fingerprinting


Results of Çeliktutan et al. 2005

• KNN classification

                Sony K700   Motorola V3   Nokia 6230
  Sony K700        100           0             0
  Motorola V3        0         100             0
  Nokia 6230         0           4            96

  Overall performance = 98.7%

                Sony K750   Motorola V3   Nokia 7270
  Sony K700         71           6            23
  Motorola V3        1          97             2
  Nokia 7270        18           6            76

  Overall performance = 81.3%

Page 25: Sensors Fingerprinting


Results of Çeliktutan et al. 2005

• Multi-class SVM classification

• Overall accuracy: 62.3%

• Random guessing: 11.1%

        S1   S2   M1   M2   N1   N2   N3   N4   L1
  S1    92    4    0    0    0    0    0    3    0
  S2     5   63    1    0    4    0    0   32    3
  M1     0    3   60   11    3    1    3    3    5
  M2     0    0   22   67    4    9    5    0    5
  N1     0    6    2    3   57    3    6    7   18
  N2     0    1    2    6    3   68   22    0    0
  N3     0    1    7   10    1   18   62    1    1
  N4     3   16    2    0    7    0    0   36   12
  L1     0    6    4    3   21    1    2   18   56

Page 26: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• Popescu 2005*
• Motivating assumption: rows and columns of interpolated images are likely to be correlated with their neighbors
• Two steps:
  – Expectation: the probability of each sample belonging to each model is estimated
  – Maximization: the specific form of the correlations between samples is estimated
  – Both steps are iterated till convergence

*A. Popescu, Statistical Tools for Digital Image Forensics, Ph.D. Dissertation, Department of Computer Science, Dartmouth College (2005).

Page 27: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• Assume that each sample belongs to one of two models
  – M1, if a sample is linearly correlated with its neighbors
  – M2, if a sample is not correlated with its neighbors

  f(x,y) = Σ_{u,v=-N}^{N} α_{u,v} f(x+u, y+v) + n(x,y)

Page 28: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• Expectation Step
  – The probability of each sample belonging to M1 is estimated using Bayes’ rule:

  Pr{f(x,y) ∈ M1 | f(x,y)} =
      Pr{f(x,y) | f(x,y) ∈ M1} Pr{f(x,y) ∈ M1}
      / Σ_{i=1}^{2} Pr{f(x,y) | f(x,y) ∈ Mi} Pr{f(x,y) ∈ Mi}

Page 29: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• The probability of observing a sample, knowing it was generated by model M1:

  Pr{f(x,y) | f(x,y) ∈ M1} =
      (1 / (σ√(2π))) exp[ -(1 / (2σ²)) ( f(x,y) - Σ_{u,v=-N}^{N} α_{u,v} f(x+u, y+v) )² ]

Page 30: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• The Maximization step
  – Generate a new estimate of α using weighted least squares:

    E(α) = Σ_{x,y} w(x,y) ( f(x,y) - Σ_{u,v=-N}^{N} α_{u,v} f(x+u, y+v) )²

  – Execute both steps until a stable α is achieved. The final result maximizes the likelihood of the observed samples
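A compact sketch of this EM loop for one color channel, under the linear-prediction model above. The window size, the uniform model for M2, the fixed σ, and the convergence test are simplifications relative to Popescu 2005:

```python
import numpy as np

def em_cfa_correlation(f, N=1, sigma=2.0, p_m2=1.0 / 256, iters=20):
    """Estimate interpolation coefficients alpha and the per-pixel probability
    of belonging to M1 (linearly correlated with its neighbors)."""
    f = f.astype(np.float64)
    H, W = f.shape
    # Each row of A holds the (2N+1)^2 - 1 neighbors of one interior pixel
    offsets = [(u, v) for u in range(-N, N + 1) for v in range(-N, N + 1) if (u, v) != (0, 0)]
    core = f[N:H - N, N:W - N].ravel()
    A = np.column_stack([f[N + u:H - N + u, N + v:W - N + v].ravel() for u, v in offsets])

    alpha = np.full(len(offsets), 1.0 / len(offsets))     # initial coefficients
    for _ in range(iters):
        # E-step: posterior probability that each sample follows the linear model M1
        residual = core - A @ alpha
        pr_m1 = np.exp(-residual ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
        w = pr_m1 / (pr_m1 + p_m2)                        # M2 modeled as uniform

        # M-step: weighted least squares for a new alpha
        Aw = A * w[:, None]
        alpha_new, *_ = np.linalg.lstsq(Aw.T @ A, Aw.T @ core, rcond=None)
        if np.allclose(alpha_new, alpha, atol=1e-6):
            alpha = alpha_new
            break
        alpha = alpha_new
    return alpha, w.reshape(H - 2 * N, W - 2 * N)
```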

Page 31: Sensors Fingerprinting


Demosaicing Algorithms*

[Figure: image, p, and F(p) for bilinear, bi-cubic, and smooth line interpolation]

*Popescu 2005

Page 32: Sensors Fingerprinting


Demosaicing Algorithms*

[Figure: image, p, and F(p) for median 5x5, gradient, and median 3x3 interpolation]

*Popescu 2005

Page 33: Sensors Fingerprinting


Demosaicing Algorithms*

[Figure: image, p, and F(p) for adaptive color plane, variable number of gradients, and no CFA interpolation]

*Popescu 2005

Page 34: Sensors Fingerprinting


The Expectation/Maximization Algorithm

• Results of Popescu 2005
  – Average accuracy over all pairs of algorithms tested was 97%
  – Minimum testing accuracy was 87% (3x3 median filter vs. variable number of gradients)

[Figure: Estimated interpolation coefficients from 100 CFA images for each of 8 different algorithms, projected onto a 2D space]

Page 35: Sensors Fingerprinting


EM Algorithm for Camera Detection

• Bayram et al. 2005*
  – Applies the EM algorithm to identify 3 different cameras, classification via SVM

  (rows = actual, columns = predicted)

  3x3 interpolation kernel:
            Nikon   Sony
    Nikon   95.71    4.29
    Sony    17.14   82.86

  4x4 interpolation kernel:
            Nikon   Sony
    Nikon   91.43    8.57
    Sony     5.71   94.29

  5x5 interpolation kernel:
            Nikon   Sony
    Nikon   94.64    5.36
    Sony     3.57   96.43

*S. Bayram, H. Sencar and N. Memon, “Source Camera Identification Based on CFA interpolation”, Proc. of the IEEE ICIP (2005)

Page 36: Sensors Fingerprinting


EM Algorithm for Camera Detection

• Bayram et al. 2005
  – Multi-class SVM

  5x5 interpolation kernel (rows = actual, columns = predicted):

            Nikon   Sony   Canon
    Nikon   85.71  10.71    3.57
    Sony    10.71  75.00   14.28
    Canon    0.00  10.71   89.28

Page 37: Sensors Fingerprinting


Enhancements to the EM Approach

• Bayram et al. 2006*
• Better detection of interpolation artifacts in smooth images
  – Low-order interpolation introduces periodicity in the variance of the second derivative of an interpolated signal


*S. Bayram, H.T. Sencar and N. Memon, “Improvements on Source Camera-Model Identification Based on CFA Interpolation,” Proc. of WG 11.9 Int. Conf. on Digital Forensics (2006).

Page 38: Sensors Fingerprinting


Enhancements to the EM Approach

• Results of Bayram et al. 2006 (rows = actual, columns = predicted)

  5x5 interpolation kernel:
            Nikon   Sony   Canon
    Nikon   85.71  10.71    3.57
    Sony    10.71  75.00   14.28
    Canon    0.00  10.71   89.28

  Periodicity in the second order derivative:
            Nikon   Sony   Canon
    Nikon   76.78   8.92   14.28
    Sony    12.50  76.78   10.71
    Canon   19.64  10.71   69.64

  Combined set of features:
            Nikon   Sony   Canon
    Nikon   94.78   1.50    3.72
    Sony     2.08  95.28    2.64
    Canon    0.00   2.26   97.74

Page 39: Sensors Fingerprinting


Enhancements to the EM Approach

• Long et al. 2006
  – Use the modeling error, instead of the interpolation filter coefficients
• Swaminathan et al. 2006
  – Assumes a CFA pattern, discriminating between the interpolated and un-interpolated pixel locations and values
  – Estimate the interpolation filter coefficients corresponding to the pattern
  – Compute the error between an image of newly re-interpolated “interpolated” pixels and the actual image (see the sketch below)
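A minimal sketch of that re-interpolation error idea, assuming an RGGB Bayer layout and a fixed bilinear kernel for the green channel; Swaminathan et al. estimate the interpolation coefficients rather than fixing them, and repeat this for each candidate CFA pattern:

```python
import numpy as np
from scipy.ndimage import convolve

def green_reinterpolation_error(green):
    """Re-interpolate the green values a Bayer CFA would have interpolated,
    and measure how far they are from the values actually in the image."""
    H, W = green.shape
    g = green.astype(np.float64)

    # Assumed RGGB layout: green is sampled where row and column parity differ,
    # so sites with equal parity were interpolated by the camera
    yy, xx = np.mgrid[0:H, 0:W]
    interpolated_sites = (yy % 2) == (xx % 2)

    # Bilinear estimate from the four direct neighbors
    kernel = np.array([[0.0, 0.25, 0.0],
                       [0.25, 0.0, 0.25],
                       [0.0, 0.25, 0.0]])
    estimate = convolve(g, kernel, mode="mirror")

    err = g[interpolated_sites] - estimate[interpolated_sites]
    return np.mean(err ** 2)   # a small error supports this CFA/interpolation hypothesis
```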

Page 40: Sensors Fingerprinting


Lens Distortions

• Compensation for radial distortion induces unique artifacts in the images

• Choi et al. introduce a second-order radially symmetric distortion model
  – Model parameters are used as classification features
  – Accuracy ~91%

[Figure: radially distorted image and rectified image]
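A sketch of the kind of second-order radial distortion model such features come from. The parameterization r_u = r_d (1 + k1 r_d² + k2 r_d⁴) is a common choice and an assumption here, with (k1, k2) serving as per-camera features:

```python
import numpy as np

def undistort_radius(r_d, k1, k2):
    """Map a distorted radius to an undistorted one under a second-order radial model."""
    return r_d * (1.0 + k1 * r_d ** 2 + k2 * r_d ** 4)

def distortion_features(radius_pairs):
    """Fit (k1, k2) from matched (distorted, undistorted) radii by least squares.
    radius_pairs: array with rows (r_distorted, r_undistorted)."""
    r_d, r_u = radius_pairs[:, 0], radius_pairs[:, 1]
    # r_u - r_d = k1 * r_d^3 + k2 * r_d^5  -> linear in (k1, k2)
    A = np.column_stack([r_d ** 3, r_d ** 5])
    (k1, k2), *_ = np.linalg.lstsq(A, r_u - r_d, rcond=None)
    return k1, k2
```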

Page 41: Sensors Fingerprinting


Sensor Fingerprinting: Individual Source Identification

• Need more detail beyond what we’ve looked at so far with Source Model Identification
  – Hardware and component imperfections, defects, and faults
  – Effects of the manufacturing process, environment, operating conditions
  – Aberrations produced by a lens, noisy sensor, dust on the lens
• Artifacts may be temporal!

Page 42: Sensors Fingerprinting


Imaging Sensor Imperfections

• Early work: Kurosawa et al. 1999*
  – Detect fixed pattern noise caused by dark current in digital video cameras
  – Dark current: the rate at which electrons accumulate in each pixel due to thermal action

[Figure: Accumulation of dark current on a CCD (dark frame brightened 37 times for viewing)]

Image credit: http://www.diaginc.com/techforum/imagecorrections.shtml

*K. Kurosawa, K. Kuroki and N. Saitoh, “CCD Fingerprint Method”, Proc. of the IEEE ICIP (1999)
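A minimal sketch of the dark-frame idea: average several dark frames to estimate a camera's fixed pattern noise, then correlate a questioned dark frame against that estimate. The correlation test here is an illustrative stand-in, not Kurosawa et al.'s exact procedure:

```python
import numpy as np

def fixed_pattern_noise(dark_frames):
    """Estimate fixed pattern noise by averaging dark frames from one camera."""
    stack = np.stack([d.astype(np.float64) for d in dark_frames])
    fpn = stack.mean(axis=0)
    return fpn - fpn.mean()              # keep only the pixel-to-pixel pattern

def matches_camera(test_dark_frame, fpn, threshold=0.5):
    """Correlate a questioned dark frame against a camera's FPN estimate."""
    t = test_dark_frame.astype(np.float64)
    t = t - t.mean()
    corr = np.sum(t * fpn) / (np.linalg.norm(t) * np.linalg.norm(fpn) + 1e-12)
    return corr, corr > threshold
```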

Page 43: Sensors Fingerprinting


Imaging Sensor Imperfections

• Kurosawa et al. 1999

[Figure: fixed pattern noise from four DCR-VX1000 camcorders, serial numbers 80642, 30821, 72567, 49967]

Page 44: Sensors Fingerprinting


Imaging Sensor Imperfections

• Geradts et al. 2001*
  – Detect “hot pixels”, cold/dead pixels, pixel traps, and cluster defects

[Figure: averaged blank images from two cameras (Camera A and Camera B) of the same model]

*Z.J. Geradts, J. Bijhold, M. Kieft, K. Kurosawa, K. Kuroki, and N. Saitoh, “Methods for Identification of Images Acquired with Digital Cameras”, Proc. of SPIE, vol. 4232 (2001)

Page 45: Sensors Fingerprinting


Imaging Sensor Imperfections

• Effects of temperature on pixel response
• Pixel defects in a real image

[Figure: pixel response at 0°C, 20°C, and 40°C; pixel defects visible in a real image]

Page 46: Sensors Fingerprinting


Imaging Sensor Imperfections

• No quantitative analysis of the previous two methods
• Lukas et al. 2006*: formal quantification and analysis of sensor noise for identification

  Pattern noise
    – Fixed pattern noise
    – Photo-response non-uniformity noise
        – Pixel non-uniformity
        – Low-frequency defects

*J. Lukas, J. Fridrich and M. Goljan, “Digital Camera Identification From Sensor Pattern Noise,” IEEE Trans. On Inf. Forensics and Security, vol. 1, no. 2, pp. 205-214.

Page 47: Sensors Fingerprinting


Pixel Non-Uniformity Noise

• The signal r exhibits properties of a white noise signal with an attenuated high-frequency band
  – Attenuation is likely due to the low-pass character of the CFA interpolation algorithm
• PNU noise cannot be found in saturated pixels (255)

[Figure: Magnitude of the Fourier transform of one row in an image, obtained as the average over 118 images of a flat scene]

Page 48: Sensors Fingerprinting


PNU Camera Identification Algorithm

• Establish a camera reference pattern Pc, which is an approximation to the PNU noise
• An approximation of the PNU noise from N images P(k):

  Pc ≈ (P(1) + P(2) + … + P(N)) / N

• Optimization: suppress scene content by applying a denoising filter F and averaging the noise residuals n(k):

  n(k) = P(k) - F(P(k))

Page 49: Sensors Fingerprinting


PNU Camera Identification Algorithm

• Calculate the correlation C between the noise residual n = p - F(p) and the camera reference pattern Pc:

  C(p) = ( (n - n̄) · (Pc - P̄c) ) / ( ||n - n̄|| ||Pc - P̄c|| )
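A minimal sketch of this reference-pattern correlation; the Gaussian filter is a stand-in for the wavelet-based denoising filter F used by Lukas et al.:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(img, sigma=3):
    """n = p - F(p); a Gaussian blur stands in for the denoising filter F."""
    p = img.astype(np.float64)
    return p - gaussian_filter(p, sigma)

def reference_pattern(images):
    """Approximate a camera's PNU noise by averaging noise residuals."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def correlation(img, pc):
    """Normalized correlation between an image's residual and the reference pattern Pc."""
    n = noise_residual(img)
    n, pc = n - n.mean(), pc - pc.mean()
    return np.sum(n * pc) / (np.linalg.norm(n) * np.linalg.norm(pc) + 1e-12)

# An image is attributed to the camera when correlation(img, Pc) exceeds a
# decision threshold t chosen for a target false acceptance rate (FAR).
```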

Page 50: Sensors Fingerprinting


PNU Identification: Results of Lukas et al. 2006

[Figure: Distribution of the correlation of the reference pattern from the Canon S40 with noise residuals from 300 Olympus C765 images]

[Figure: Distribution of the correlation of the reference pattern from the Nikon D100 with noise residuals from 8x300 images from all other cameras]

Page 51: Sensors Fingerprinting


PNU Identification: Results of Lukas et al. 2006

Decision threshold t and FRR for FAR = 10^-3:

            t        FRR
  Nikon    0.0449   4.68x10^-3
  C765-1   0.0170   3.79x10^-4
  C765-2   0.0080   5.75x10^-11
  G2       0.0297   2.31x10^-4
  S40      0.0322   1.42x10^-4
  Sigma    0.0063   2.73x10^-4
  Kodak    0.0997   1.14x10^-11
  C3030    0.0209   1.87x10^-3
  A10      0.0166   7.59x10^-5

Page 52: Sensors Fingerprinting


PNU Identification: Results of Lukas et al. 2006

[Figure: Reference pattern from the Nikon D100 correlated with noise residuals from approximately 6x300 JPEG-compressed images (quality factor 90) from 6 other cameras]

[Figure: Correlation of noise residuals from 84 Canon G2 1600x1200 JPEG images with 6 reference patterns]

Page 53: Sensors Fingerprinting


Application of PNU Camera Identification

• Sutcu et al. 2007*
  – Fuse pattern noise properties with demosaicing characteristics (decision flow below)

  Pattern noise match?
    – no  → Decision: image was not taken by this camera
    – yes → Classifier based on interpolation coefficients
              – no  → Decision: image was not taken by this camera
              – yes → Decision: image was taken by this camera

*Y. Sutcu, S. Bayram, H.T. Sencar and N. Memon, “Improvements on Sensor Noise Based Camera Identification,” Proc. of IEEE ICME (2007).
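A tiny sketch of that fused decision rule, assuming the first stage yields a PNU correlation score compared against a threshold and the second stage yields a yes/no answer from an interpolation-coefficient classifier (both inputs are placeholders):

```python
def camera_decision(pnu_score, cfa_says_this_camera, t=0.03):
    """Two-stage fusion: accept only if both the pattern noise match and the
    CFA-interpolation classifier attribute the image to this camera."""
    if pnu_score <= t:                       # stage 1: pattern noise match
        return "image was not taken by this camera"
    if not cfa_says_this_camera:             # stage 2: interpolation-coefficient classifier
        return "image was not taken by this camera"
    return "image was taken by this camera"

# Example: pnu_score from a PNU correlation, cfa_says_this_camera from an SVM
print(camera_decision(pnu_score=0.12, cfa_says_this_camera=True))
```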

Page 54: Sensors Fingerprinting


Application of PNU Camera Identification


