08 - image compression 1 - uni-hamburg.deseppke... · 2015-11-09 · min-fakultät fachbereich...

MIN-Fakultät, Fachbereich Informatik, Arbeitsbereich SAV/BV (KOGS)
Image Processing 1 (IP1) / Bildverarbeitung 1
Lecture 8 – Image Compression 1
Winter Semester 2014/15
Slides: Prof. Bernd Neumann
Slightly revised by: Dr. Benjamin Seppke & Prof. Siegfried Stiehl


Page 1

MIN-Fakultät, Fachbereich Informatik, Arbeitsbereich SAV/BV (KOGS)
Image Processing 1 (IP1) / Bildverarbeitung 1
Lecture 8 – Image Compression 1
Winter Semester 2014/15
Slides: Prof. Bernd Neumann; slightly revised by Dr. Benjamin Seppke & Prof. Siegfried Stiehl

Page 2

Image Data Compression

Data compression saves storage space as well as transmission capacity for digital/discrete data. To this end, the data are rearranged into a new structure which needs less storage than the original one.

Image data compression is important for:
• Image archives, e.g. large satellite images
• Image transmission, e.g. over the internet
• Multimedia applications, e.g. desktop editing

Image data compression exploits redundancy for more efficient coding:


IP1 – Lecture 8: Image Compression 1

[Diagram: digital image → data redundancy reduction → transmission, storage, archiving → reconstruction]

12.11.15 University of Hamburg, Dept. Informatics

Page 3

Different Kinds of Compression

Lossless compression:
• Original data may be reconstructed exactly from the compressed data.
• Examples:
  – File compression, e.g. ZIP, GZ etc.
  – Image formats: TIFF and PNG

Lossy compression:
• Original data may be reconstructed approximately from the compressed data.
• Examples:
  – JPEG image files
  – DVDs and Blu-rays
  – MP3 audio files



Page 4

Run Length Coding

Images with repeating grey values along rows (or columns) can be compressed by storing "runs" of identical grey values in the format:

    grey value 1, repetition 1, grey value 2, repetition 2, …

For B/W images (e.g. fax data) another run length code is used:

    row #, column # run 1 begin, column # run 1 end, column # run 2 begin, column # run 2 end, …

[Figure: binary example image with columns numbered 0–15 and rows 0–3]

Run length code:

    (0, 3, 5, 9, 9) (1, 1, 7, 9, 9) (3, 4, 4, 6, 6, 8, 8, 10, 10, 12, 14)
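The B/W run length format above can be sketched in code. A minimal Python sketch, with illustrative function names not taken from the lecture; each row containing black pixels is encoded as (row, begin1, end1, begin2, end2, …):

```python
# Minimal sketch of the B/W run length code described above.
# A binary image is a list of rows; each row is a list of 0/1 pixels.

def encode_bw(image):
    """Encode each row with black pixels as (row, begin1, end1, begin2, end2, ...)."""
    code = []
    for r, row in enumerate(image):
        runs = [r]
        c = 0
        while c < len(row):
            if row[c] == 1:
                begin = c
                while c + 1 < len(row) and row[c + 1] == 1:
                    c += 1
                runs.extend([begin, c])   # one run: begin and end column
            c += 1
        if len(runs) > 1:                 # skip all-white rows
            code.append(tuple(runs))
    return code

def decode_bw(code, n_rows, n_cols):
    """Rebuild the binary image from the run length tuples."""
    image = [[0] * n_cols for _ in range(n_rows)]
    for entry in code:
        r, runs = entry[0], entry[1:]
        for begin, end in zip(runs[0::2], runs[1::2]):
            for c in range(begin, end + 1):
                image[r][c] = 1
    return image
```

For example, a row 0 with black pixels in columns 3–5 and column 9 encodes as (0, 3, 5, 9, 9), as on the slide.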


Page 5

Probabilistic Data Compression

A discrete image encodes information redundantly if
1. the grey values of individual pixels are not equally probable
2. the grey values of neighbouring pixels are correlated

Information Theory provides limits for the minimal encoding of probabilistic information sources. Redundancy of the encoding of individual pixels with G grey levels each:


    H = Σ_{g=0}^{G−1} P(g) · log2(1 / P(g))

H = entropy of the pixel source = mean number of bits required to encode the information of this source

    b = ⌈log2 G⌉ = number of bits used for each pixel

    r = b − H

The entropy of a pixel source with equally probable grey values is equal to the number of bits required for coding.
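The definitions of H, b and r can be computed directly from a grey value distribution. A small Python sketch, with illustrative names not taken from the lecture:

```python
# Entropy H, bits per pixel b, and redundancy r = b - H for a grey value
# probability distribution (one probability per grey level).
import math

def entropy(probabilities):
    """H = sum over grey values g of P(g) * log2(1 / P(g))."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

def redundancy(probabilities):
    G = len(probabilities)           # number of grey levels
    b = math.ceil(math.log2(G))      # bits used for each pixel
    return b - entropy(probabilities)

# With equally probable grey values, H equals b and the redundancy is 0.
p_uniform = [1 / 8] * 8
```

For 8 equally probable grey levels, H = b = 3 bits and r = 0.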

Page 6

Huffman Coding I

The Huffman coding scheme provides a variable-length code with minimal average code-word length, i.e. least possible redundancy, for a discrete message source. (Here the messages are grey values.)

Algorithm:
1. Sort the messages by increasing probability, such that g(1) and g(2) are the least probable messages.
2. Assign 1 to the code word of g(1) and 0 to the code word of g(2).
3. Merge g(1) and g(2) into one new message by adding their probabilities.
4. Repeat steps 1–3 until a single message is left.
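The steps above can be sketched with a priority queue. A minimal Python sketch (heap-based; names are illustrative, not from the lecture). The exact code words may differ from the lecture's example by tie-breaking among equal probabilities, but the code-word lengths and the mean length agree:

```python
# Huffman code construction: repeatedly merge the two least probable
# messages, prefixing bit 1 to the least probable and bit 0 to the other.
import heapq
from itertools import count

def huffman_code(probabilities):
    """Build a Huffman code; `probabilities` maps message -> probability."""
    tie = count()  # tie-breaker so the heap never compares the dicts
    heap = [(p, next(tie), {m: ""}) for m, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)  # least probable: gets bit 1
        p2, _, code2 = heapq.heappop(heap)  # second least probable: bit 0
        merged = {m: "1" + c for m, c in code1.items()}
        merged.update({m: "0" + c for m, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

codes = huffman_code({"g5": 0.30, "g4": 0.25, "g3": 0.25, "g2": 0.10, "g1": 0.10})
```

For the lecture's probabilities this yields three 2-bit and two 3-bit code words, i.e. a mean code-word length of 2.2 bits.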


Page 7

Huffman Coding II

Example:

    Message  Probability
    g(5)     0.30
    g(4)     0.25
    g(3)     0.25
    g(2)     0.10
    g(1)     0.10

[Figure: Huffman tree – g(1) and g(2) merge to 0.20; 0.20 and g(3) merge to 0.45; g(4) and g(5) merge to 0.55; branches labelled 0 and 1]

Resulting codes:

    Message  Probability  Code
    g(5)     0.30         00
    g(4)     0.25         01
    g(3)     0.25         10
    g(2)     0.10         110
    g(1)     0.10         111

Entropy: H = 2.185
Mean codeword length: 2.2


Page 8

Statistical Dependence

An image may be modelled as a set of statistically dependent random variables with a multivariate distribution

    p(x) = p(x1, x2, …, xN)

Often the exact distribution is unknown and only correlations can be (approximately) determined.

Correlation of two variables: E[x_i x_j] = c_ij

Correlation matrix:

    E[x x^T] = | c11  c12  c13 |
               | c21  c22  c23 |
               | c31  c32  c33 |

Covariance of two variables: E[(x_i − μ_i)(x_j − μ_j)] = v_ij, with μ_k = mean of x_k

Covariance matrix:

    E[(x − μ)(x − μ)^T] = | v11  v12  v13  ⋯ |
                          | v21  v22  v23  ⋯ |
                          | v31  v32  v33  ⋯ |
                          |  ⋮    ⋮    ⋮   ⋱ |

Attention: Uncorrelated variables need not be statistically independent:

    E[x_i x_j] = 0 does not imply p(x_i, x_j) = p(x_i) · p(x_j)

But: For Gaussian random variables, uncorrelatedness implies statistical independence.


Page 9

Karhunen-Loève Transform
(also known as Hotelling Transform or Principal Components Analysis)

Determine uncorrelated variables y from correlated variables x by a linear transformation:

    y = A(x − μ)

• An orthonormal matrix A which diagonalizes the real symmetric covariance matrix V always exists:

    E[y y^T] = A E[(x − μ)(x − μ)^T] A^T = A V A^T = D

  where D is a diagonal matrix.

• A is the matrix of eigenvectors of V, D is the matrix of corresponding eigenvalues.

Reconstruction of x from y using:

    x = A^T y + μ

Note: If x is viewed as a point in n-dimensional Euclidean space, then A defines a rotated coordinate system.
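A minimal numerical sketch of the transform (numpy assumed; the sample data and variable names are illustrative, not from the lecture): rotating with the eigenvectors of the covariance matrix decorrelates the variables, and the rotation is exactly invertible.

```python
# Karhunen-Loève transform sketch: y = A(x - mu) decorrelates x,
# and x = A^T y + mu reconstructs it exactly.
import numpy as np

rng = np.random.default_rng(0)
# Illustrative correlated 2-D data.
x = rng.multivariate_normal([1.0, 2.0], [[2.0, 0.8], [0.8, 1.0]], size=10000)

mu = x.mean(axis=0)
V = np.cov(x, rowvar=False)                    # covariance matrix of x
eigenvalues, eigenvectors = np.linalg.eigh(V)  # V is real and symmetric
A = eigenvectors.T                             # rows of A = eigenvectors of V

y = (x - mu) @ A.T                             # y = A (x - mu), per sample
D = np.cov(y, rowvar=False)                    # A V A^T = D: diagonal

x_rec = y @ A + mu                             # x = A^T y + mu (exact)
```

Because A is orthonormal, the reconstruction is exact; dropping rows of A (the next slide) is what makes the compression lossy.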


Page 10

Compression and Reconstruction using the Karhunen-Loève Transform

Eigenvectors a and eigenvalues λ are defined by

    V a = λ a

and may be estimated by solving

    det(V − λI) = 0

There exist fast solution methods for the eigenvalues of real symmetric matrices.

Let us assume that the eigenvalues λ_i and the corresponding eigenvectors of A are sorted in descending order, λ1 ≥ λ2 ≥ … ≥ λN:

    D = | λ1  0   0  |
        | 0   λ2  0  |
        | 0   0   λ3 |

x may be transformed into a K-dimensional vector y_K, with K < N, using a transformation matrix A_K which contains only the first K eigenvectors of A, corresponding to the K largest eigenvalues:

    y_K = A_K (x − μ)

The approximate reconstruction

    x̂ = A_K^T y_K + μ

minimizes the mean square error (MSE) of a representation with K dimensions. Thus, y_K represents a lossy data compression.


Page 11

Illustration of Minimum-loss Dimension Reduction

Using the Karhunen-Loève transform, data compression is achieved by
• changing (rotating) the coordinate system
• omitting the least informative dimension(s) in the new coordinate system

Example:

[Figure: a 2-D point cloud shown in (x1, x2) coordinates and, after rotation, in (y1, y2) coordinates; the points spread mainly along y1, so the y2 dimension may be omitted]


Page 12

Numerical Example for Karhunen-Loève Compression

Given:

    N = 3,  x^T = (x1  x2  x3),  μ = 0

    V = |  2      −0.866  −0.5 |
        | −0.866   2       0   |
        | −0.5     0       2   |

Eigenvalues and eigenvectors:

    det(V − λI) = 0  ⇒  λ1 = 3,  λ2 = 2,  λ3 = 1

    A^T = |  0.707   0       0.707 |
          | −0.612   0.5     0.612 |
          | −0.354  −0.866   0.354 |

    (the columns of A^T are the eigenvectors of V)

Compression into K = 2 dimensions:

    y_2 = A_2 x = | 0.707  −0.612  −0.354 | x
                  | 0       0.5    −0.866 |

Reconstruction from the compressed values:

    x̂ = A_2^T y_2 = |  0.707   0     |
                    | −0.612   0.5   | y_2
                    | −0.354  −0.866 |

Note the discrepancies between the original and the approximated values:

    x1' = 0.5 x1 − 0.43 x2 − 0.25 x3
    x2' = −0.085 x1 − 0.625 x2 + 0.39 x3
    x3' = 0.273 x1 + 0.39 x2 + 0.25 x3
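The example can be checked numerically. A sketch (numpy assumed, not part of the slides) that reproduces the eigenvalues and builds the K = 2 reconstruction operator x̂ = A_2^T A_2 x (since μ = 0); its first row matches the x1' coefficients above:

```python
# Numerical check of the Karhunen-Loève example (mu = 0, so the
# reconstruction x_hat = A2^T A2 x is a projection P applied to x).
import numpy as np

V = np.array([[ 2.0,   -0.866, -0.5],
              [-0.866,  2.0,    0.0],
              [-0.5,    0.0,    2.0]])

eigenvalues, eigenvectors = np.linalg.eigh(V)  # ascending order
order = np.argsort(eigenvalues)[::-1]          # sort descending: 3, 2, 1
eigenvalues = eigenvalues[order]
A = eigenvectors[:, order].T                   # rows of A = eigenvectors of V

A2 = A[:2]                                     # keep the 2 largest eigenvalues
P = A2.T @ A2                                  # reconstruction operator
```

P is a symmetric, idempotent projection onto the span of the two leading eigenvectors; the information along the third eigenvector (λ3 = 1) is lost.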
