Context-Based Adaptive Entropy Coding Xiaolin Wu McMaster University Hamilton, Ontario, Canada


  • Slide 1
  • Context-Based Adaptive Entropy Coding Xiaolin Wu McMaster University Hamilton, Ontario, Canada
  • Slide 2
  • 2 Data Compression System: lossless compression (entropy coding) of the quantized data; entropy coding removes redundancy and achieves compression. [Block diagram: a block of n samples → Transform → transform coefficients → Quantization → quantized coefficients → Entropy Coding → bit patterns for the Q-indices.]
  • Slide 3
  • 3 Entropy Coding Techniques: Huffman code, Golomb-Rice code, and arithmetic code. Arithmetic coding is optimal in the sense that it can approach the source entropy, and it easily adapts to non-stationary source statistics via context modeling (context selection and conditional probability estimation). Context modeling governs the performance of arithmetic coding.
  • Slide 4
  • 4 Entropy (Shannon 1948). Consider a random variable $X$ with source alphabet $\mathcal{A}$ and probability mass function $p(x) = \Pr\{X = x\}$. The self-information of the outcome $X = x$ is $I(x) = -\log p(x)$, measured in bits if the log is base 2; an event of lower probability carries more information. The self-entropy is the weighted average of the self-information: $H(X) = -\sum_{x \in \mathcal{A}} p(x) \log p(x)$.
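To make the definitions concrete, here is a minimal Python sketch (not part of the original slides) that computes self-information and entropy for a small pmf:

```python
import math

def self_information(p):
    """Self-information -log2 p(x) in bits; rarer events carry more information."""
    return -math.log2(p)

def entropy(pmf):
    """Entropy H(X) = -sum_x p(x) log2 p(x): the average self-information."""
    return sum(p * self_information(p) for p in pmf if p > 0)

print(self_information(0.5))          # 1.0 bit
print(entropy([0.5, 0.25, 0.25]))     # 1.5 bits
```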
  • Slide 5
  • 5 Conditional Entropy. Consider two random variables $X$ and $Y$ with alphabets $\mathcal{A}_X$ and $\mathcal{A}_Y$. The conditional self-information of $X = x$ given $Y = y$ is $I(x \mid y) = -\log p(x \mid y)$. The conditional entropy is the average value of the conditional self-information: $H(X \mid Y) = -\sum_{x, y} p(x, y) \log p(x \mid y)$.
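A companion sketch, again purely illustrative, estimates H(X|Y) from a made-up joint pmf; for this example the marginal entropy is H(X) = 1 bit, so conditioning indeed removes uncertainty:

```python
import math

def conditional_entropy(joint):
    """H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y), with the joint pmf
    given as a dict {(x, y): probability}."""
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

# Made-up joint pmf of two binary variables that are positively correlated:
joint = {(0, 0): 0.4, (1, 0): 0.1, (0, 1): 0.1, (1, 1): 0.4}
print(conditional_entropy(joint))   # about 0.72 bits, versus H(X) = 1.0 bit
```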
  • Slide 6
  • 6 Entropy and Conditional Entropy. The conditional entropy $H(X \mid Y)$ can be interpreted as the amount of uncertainty remaining about $X$, given that we know the random variable $Y$. The additional knowledge of $Y$ should reduce the uncertainty about $X$: $H(X \mid Y) \le H(X)$.
  • Slide 7
  • 7 Context-Based Entropy Coders. Consider a sequence of binary symbols; the previously coded symbols around the current symbol S form its context C. [Figure: each context is mapped to one of three conditional pmfs, A(0.2, 0.8), B(0.4, 0.6), C(0.9, 0.1), which drives the entropy coder (EC) for the current symbol of the example bit stream.]
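The sketch below imitates this setup. The conditional pmfs A, B, C are the ones on the slide, but the mapping from the two previous bits to a pmf, and the use of the ideal code length -log2 P in place of an actual arithmetic coder, are assumptions made only for illustration:

```python
import math

# Conditional pmfs (P(0), P(1)) of the three context classes from the slide.
PMF = {'A': (0.2, 0.8), 'B': (0.4, 0.6), 'C': (0.9, 0.1)}
# Hypothetical mapping from the two previous bits to a class (illustration only).
CONTEXT_CLASS = {(0, 0): 'C', (0, 1): 'B', (1, 0): 'B', (1, 1): 'A'}

def ideal_length(bits, pmf_of):
    """Ideal code length sum(-log2 P(bit | context)) over the sequence, in bits."""
    total = 0.0
    for i in range(2, len(bits)):
        p0, p1 = pmf_of(tuple(bits[i - 2:i]))
        total -= math.log2(p1 if bits[i] else p0)
    return total

bits = [0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1]   # prefix of the slide's stream
with_context = ideal_length(bits, lambda c: PMF[CONTEXT_CLASS[c]])
p1 = sum(bits) / len(bits)
no_context = ideal_length(bits, lambda c: (1 - p1, p1))    # one global model
print(f"with contexts: {with_context:.2f} bits, single model: {no_context:.2f} bits")
```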
  • Slide 8
  • 8 Context Model: the estimated conditional probability. Variable-length coding schemes need estimates of the probability of each symbol, i.e. a model. The model can be: static, a fixed global model for all inputs (e.g. English text); semi-adaptive, computed for the specific data being coded and transmitted as side information (e.g. C programs); or adaptive, constructed on the fly (any source!).
  • Slide 9
  • 9 Adaptive vs. Semi-adaptive. Advantages of semi-adaptive: simple decoder. Disadvantages of semi-adaptive: the overhead of specifying the model can be high, and two passes over the data are required. Advantages of adaptive: one-pass, universal, and as good if not better. Disadvantages of adaptive: the decoder is as complex as the encoder, and errors propagate.
  • Slide 10
  • 10 Adaptation with Arithmetic and Huffman Coding. Huffman coding: manipulate the Huffman tree on the fly; efficient algorithms are known, but they nevertheless remain complex. Arithmetic coding: update the cumulative probability distribution table; efficient data structures and algorithms are known, and the rest stays essentially the same. The main advantage of arithmetic coding over Huffman coding is the ease with which the former can be used in conjunction with adaptive modeling techniques.
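A minimal sketch of that adaptive-model bookkeeping (class and method names are invented for illustration; the interval arithmetic of the coder is omitted):

```python
class AdaptiveModel:
    """Symbol-frequency bookkeeping for an adaptive arithmetic coder (sketch).
    The interval arithmetic of the coder itself is not shown."""

    def __init__(self, alphabet_size):
        self.freq = [1] * alphabet_size          # start with all-ones counts

    def cum_range(self, s):
        """(low, high, total) cumulative counts the coder would use for symbol s."""
        low = sum(self.freq[:s])
        return low, low + self.freq[s], sum(self.freq)

    def update(self, s):
        self.freq[s] += 1                        # adapt after coding each symbol

model = AdaptiveModel(4)
for s in [2, 2, 1, 2, 3, 2]:
    low, high, total = model.cum_range(s)        # would be fed to the coder
    model.update(s)
print(model.freq)                                # [1, 2, 5, 2]
```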
  • Slide 11
  • 11 Context Models. If the source is not i.i.d., there are complex dependencies between the symbols in the sequence. In most practical situations the pdf of a symbol depends on neighboring symbol values, i.e. its context; hence we condition the encoding of the current symbol on its context. How to select contexts? A rigorous answer is beyond our scope; practical schemes use a fixed neighborhood.
  • Slide 12
  • 12 Context Dilution Problem. The minimum code length of a sequence $x_1 x_2 \cdots x_n$ achievable by arithmetic coding is $-\sum_{i=1}^{n} \log_2 P(x_i \mid c_i)$, if the conditional pmf $P(x \mid c)$ is known. The difficulty lies in estimating $P(x \mid c)$: with many contexts the sample statistics per context become insufficient, preventing the use of high-order Markov models.
  • Slide 13
  • 13 Estimating Probabilities in Different Contexts. Two approaches: (1) maintain symbol occurrence counts within each context; the number of contexts needs to be modest to avoid context dilution. (2) Assume the pdf shape within each context is the same (e.g. Laplacian) and only its parameters (e.g. mean and variance) differ; the estimation may not be as accurate, but a much larger number of contexts can be used.
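A small sketch of the first approach: keep occurrence counts per context and convert them to conditional probability estimates. The add-one (Laplace) smoothing and the context key used below are assumptions for illustration only:

```python
from collections import defaultdict

class ContextCounts:
    """Approach (1): per-context symbol occurrence counts turned into
    conditional probability estimates (add-one smoothing is an assumption)."""

    def __init__(self, alphabet_size):
        self.K = alphabet_size
        self.counts = defaultdict(lambda: [0] * alphabet_size)

    def update(self, context, symbol):
        self.counts[context][symbol] += 1

    def prob(self, context, symbol):
        c = self.counts[context]
        return (c[symbol] + 1) / (sum(c) + self.K)

cc = ContextCounts(alphabet_size=4)
cc.update(('left', 'up'), 2)                 # hypothetical context key
print(cc.prob(('left', 'up'), 2))            # (1 + 1) / (1 + 4) = 0.4
```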
  • Slide 14
  • 14 Context Explosion. Consider an image quantized to 16 levels and a causal context of four neighboring pixels: $16^4 = 65{,}536$ contexts! There is not enough data to learn the histograms (the data dilution problem). Solution: quantize the context space.
  • Slide 15
  • 15 Current Solutions. Non-binary sources: JPEG uses simple entropy coding without context; J2K uses an ad-hoc context quantization strategy. Binary sources: JBIG uses suboptimal context templates of modest sizes.
  • Slide 16
  • 16 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 17
  • 17 Context Quantization: group the contexts with similar histograms. [Block diagram: Define Context → Quantize Context → Estimate Prob. → Entropy Coders (adaptive arithmetic coders 1 … N).]
  • Slide 18
  • 18 Distortion Measure: the Kullback-Leibler distance between pmf histograms, $D(p \| q) = \sum_x p(x) \log_2 \frac{p(x)}{q(x)}$. It is always non-negative, and zero only when the two pmfs are identical.
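For reference, a direct implementation of this distance between two pmfs (it assumes q is positive wherever p is):

```python
import math

def kl_distance(p, q):
    """Kullback-Leibler distance D(p || q) in bits between two pmfs.
    Assumes q is positive wherever p is; non-negative, zero only if p == q."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Using the pmfs of contexts A and B from the earlier example slide:
print(kl_distance((0.2, 0.8), (0.4, 0.6)))   # > 0
print(kl_distance((0.2, 0.8), (0.2, 0.8)))   # 0.0
```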
  • Slide 19
  • 19 Context Quantization. Context histograms are partitioned into Voronoi regions around centroid histograms with the GLA (k-means) algorithm: nearest-neighbor assignment, then centroid computation, repeated until convergence.
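A compact sketch of that GLA loop over context pmfs under the KL distance; the initialization, the fixed iteration count, and the unweighted centroid averaging are simplifications rather than the exact procedure of the talk:

```python
import math

def kl(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def gla_context_quantizer(histograms, M, iterations=20):
    """GLA (k-means) over context pmfs with the KL distance (sketch).
    histograms: list of strictly positive pmfs (tuples). Returns group labels
    and centroid pmfs."""
    centroids = [histograms[i * len(histograms) // M] for i in range(M)]
    labels = [0] * len(histograms)
    for _ in range(iterations):
        # Nearest-neighbor step: assign each histogram to its closest centroid.
        labels = [min(range(M), key=lambda m: kl(h, centroids[m]))
                  for h in histograms]
        # Centroid step: the KL-optimal centroid is the average of the members.
        for m in range(M):
            members = [h for h, lab in zip(histograms, labels) if lab == m]
            if members:
                centroids[m] = tuple(sum(col) / len(members)
                                     for col in zip(*members))
    return labels, centroids
```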
  • Slide 20
  • 20 Experimental Results. 1st experiment, source: use a controlled source with memory, a 1st-order Gauss-Markov process; generate $10^7$ samples, quantize them into 32 levels, and flip the sign of every sample with probability 0.5.
  • Slide 21
  • 21 1st experiment, setup. Context space: the causal neighbors $L_1, L_2, L_3$ of the current sample S; context size = 1024. All the dependence is on $L_1$ (the source is 1st-order Markov). How many groups do we expect? A maximum of 16 are needed, and all will be bimodal.
  • Slide 22
  • 22 1st experiment, results: vary M, the number of groups.
      M     Rate [bits/sym]
      1     4.062
      2     3.709
      4     3.563
      8     3.510
      16    3.504
      774   3.493   (774 = number of non-empty contexts)
    Only need 8!
  • Slide 23
  • 23 histograms for M=16
  • Slide 24
  • 24 2nd experiment, source: a 512×512 natural image (barb); wavelet transform with 9-7 filters, 3 levels; each subband scalar-quantized to 24 levels; test on sub-image S0.
  • Slide 25
  • 25 2nd experiment, setup. Context space: the causal neighbors $L_1, L_2, L_3, L_4$ of the current sample S; context size = 313,776. Group histogram structure? Unknown.
  • Slide 26
  • 26 2nd experiment, results for subband S0 (low pass): we need our method to quantize the context space!
  • Slide 27
  • 27 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 28
  • 28 Quantizer Description. [Block diagram: Define Context → Quantize Context → Estimate Prob. → Entropy Coders (adaptive arithmetic coders 1 … N), plus Describe Context Book: the quantizer mapping must be conveyed to the decoder.]
  • Slide 29
  • 29 Coarse Context Quantization. [Figure: the current sample S is coded with the full-resolution quantizer, while the context components $X_0, X_1, X_2$ are re-quantized by low-resolution context quantizers.] Number of contexts: 9×9×9 = 729 at full resolution versus 6×5×3 = 90 after coarse context quantization.
  • Slide 30
  • 30 [Figure: an example state sequence together with its context indices, the group indices produced by context quantization, and the context book that maps context indices to group indices.]
  • Slide 31
  • 31 Experimental Results. Experiment source: a 512×512 natural image (barb); wavelet transform with 9-7 filters, 3 levels; each subband scalar-quantized to 24 levels.
  • Slide 32
  • 32 Experimental Results. [Tables for subbands LH3 and LH2: side information rate and data rate.] ???
  • Slide 33
  • 33 Experimental Results. [Tables for subbands LH3 and LH2: side information rate and data rate.]
  • Slide 34
  • 34 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context-Based Classification and Quantization; Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Conclusions and Future Work.
  • Slide 35
  • 35 MDL-Based Context Quantizer. [Block diagram: Define Context → Quantize Context → Estimate Prob. → Entropy Coders (adaptive arithmetic coders 1 … N), plus Describe Context Book.]
  • Slide 36
  • 36 Distortion Measure. [Figure: the binary contexts A(0.2, 0.8), B(0.4, 0.6), and C(0.9, 0.1) placed on the probability axis from 0 to 1.]
  • Slide 37
  • 37 Context Quantization. [Figure: vector quantization of context histograms into Voronoi regions around centroid histograms reaches only local optima; scalar quantization along the probability axis from 0 to 1 reaches the global optimum.]
  • Slide 38
  • 38 Proposed Method. Context mapping description: use a training set to obtain the pmfs of common contexts, and apply the classification-map method for rare contexts. Minimum Description Length context quantizer: minimize the objective function (data code length + side info. code length); a dynamic programming method is applied to achieve the global minimum (a sketch of such a partition DP follows below).
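A sketch of one plausible reading of the dynamic-programming step, under these assumptions: the binary contexts are pre-sorted by their estimated P(1 | context), each group is a contiguous interval on that axis, the group cost is the pooled static (empirical-entropy) code length, and the side-information term is left out for brevity:

```python
import math

def group_cost(c0, c1):
    """Static code length (bits) when the 0/1 counts of a group are pooled."""
    n = c0 + c1
    return sum(-c * math.log2(c / n) for c in (c0, c1) if c > 0)

def optimal_partition(contexts, M):
    """contexts: list of (n0, n1) counts, pre-sorted by n1 / (n0 + n1).
    Returns (minimum total data code length, cut points) for M groups."""
    K = len(contexts)
    p0, p1 = [0] * (K + 1), [0] * (K + 1)          # prefix sums of the counts
    for i, (n0, n1) in enumerate(contexts):
        p0[i + 1], p1[i + 1] = p0[i] + n0, p1[i] + n1
    seg = lambda j, i: group_cost(p0[i] - p0[j], p1[i] - p1[j])
    INF = float('inf')
    cost = [[INF] * (K + 1) for _ in range(M + 1)]
    back = [[0] * (K + 1) for _ in range(M + 1)]
    cost[0][0] = 0.0
    for m in range(1, M + 1):
        for i in range(1, K + 1):
            for j in range(m - 1, i):              # last group is contexts[j:i]
                c = cost[m - 1][j] + seg(j, i)
                if c < cost[m][i]:
                    cost[m][i], back[m][i] = c, j
    cuts, i = [], K                                # recover the group boundaries
    for m in range(M, 1, -1):
        i = back[m][i]
        cuts.append(i)
    return cost[M][K], sorted(cuts)
```

Sorting the contexts by their conditional probability is what makes contiguous groups sufficient, so a one-dimensional DP of this form can reach the global minimum the slide refers to.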
  • Slide 39
  • 39 Contributions: a new context quantizer design approach based on the principle of minimum description length; an input-dependent context quantizer design algorithm with efficient handling of rare contexts; context quantization for minimum adaptive code length; a novel technique to handle the mismatch between training statistics and source statistics.
  • Slide 40
  • 40 Experimental Results. [Tables: bit rates (bpp) of dithering halftone images; bit rates (bpp) of error-diffusion halftone images.]
  • Slide 41
  • 41 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 42
  • 42 Motivation. The MDL-based context quantizer is designed mainly from the training-set statistics. If there is any mismatch in statistics between the input and the training set, the optimality of the pre-designed context quantizer can be compromised.
  • Slide 43
  • 43 Input-Dependent Context Quantization. Context quantizer design: raster-scan the input image to obtain the conditional probabilities and the number of occurrences of each context instance; minimize the objective function by dynamic programming; the reproduction pmfs are sent as side information.
  • Slide 44
  • 44 Handling of Rare Context Instances. Context template definition; estimation of the conditional probabilities from the training set. Rare contexts are estimated from a context of reduced size; this estimate is used only as the initial one and is updated while coding the input image.
  • Slide 45
  • 45 Coding Process. [Block diagram: Context Calculation → Prob. Estimation → Context Quantization → Entropy Coders, with an update feedback loop.] The context quantizer output may change as the estimate of the conditional probability becomes more accurate along the way.
  • Slide 46
  • 46 Experimental Results Bit rates (bpp) of error diffusion halftone images
  • Slide 47
  • 47 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 48
  • 48 Motivation: to minimize the actual adaptive code length instead of the static code length, and to minimize the effect of the mismatch between the input and training-set statistics.
  • Slide 49
  • 49 Context Quantization for Minimum Adaptive Code Length. The adaptive code length can be calculated directly from the symbol counts in each group; the order in which the 0s and 1s appear does not change the adaptive code length.
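An illustrative calculation, assuming the adaptive coder behaves like a Laplace (add-one) sequential estimator (the slide does not name the estimator): the sequential code length then depends only on the counts of 0s and 1s, so reordering the bits leaves it unchanged:

```python
import math
from math import lgamma

def adaptive_code_length(bits):
    """Sequential adaptive code length in bits with a Laplace (add-one)
    estimator, standing in for the adaptive arithmetic coder."""
    n0 = n1 = 0
    length = 0.0
    for b in bits:
        p = (n1 + 1) / (n0 + n1 + 2) if b else (n0 + 1) / (n0 + n1 + 2)
        length -= math.log2(p)
        n1 += b
        n0 += 1 - b
    return length

def closed_form(n0, n1):
    """Same quantity from the counts alone: -log2( n0! n1! / (n0 + n1 + 1)! )."""
    return (lgamma(n0 + n1 + 2) - lgamma(n0 + 1) - lgamma(n1 + 1)) / math.log(2)

seq = [0, 1, 1, 0, 1, 0, 0, 0]
print(adaptive_code_length(seq))                      # ~8.98 bits
print(closed_form(seq.count(0), seq.count(1)))        # identical value
print(adaptive_code_length(sorted(seq)))              # reordering changes nothing
```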
  • Slide 50
  • 50 Context Quantization for Minimum Adaptive Code Length. The objective function is designed to minimize the effect of the mismatch between the input and the training set.
  • Slide 51
  • 51 Experimental Results Bit rates (bpp) of error diffusion halftone images
  • Slide 52
  • 52 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 53
  • 53 Motivation. [Figure: a source pdf shown with the cells of a single quantizer versus the quantizers obtained after classification.]
  • Slide 54
  • 54 Single Quantizer. [Block diagram: transform coefficients → Transform (T) → Quantizer (Q) → Entropy Coder (EC), with one probability model for all coefficients.]
  • Slide 55
  • 55 Classification and Quantization. [Figure: a classification-and-quantization pipeline built from the blocks T, Q, IQ, C, and EC; the example rows show transform coefficients, their initial quantization indices, and the resulting binary group indices.]
  • Slide 56
  • 56 Experimental Results
  • Slide 57
  • 57 Overview: Introduction; Context Quantization for Entropy Coding of Non-Binary Source (Context Quantization; Context Quantizer Description); Context Quantization for Entropy Coding of Binary Source (Minimum Description Length Context Quantizer Design; Image Dependent Context Quantizer Design with Efficient Side Info.; Context Quantization for Minimum Adaptive Code Length); Context-Based Classification and Quantization; Conclusions and Future Work.
  • Slide 58
  • 58 Conclusions. Non-binary sources: a new context quantization method is proposed, together with efficient context description strategies. Binary sources: a globally optimal partition of the context space via the MDL-based context quantizer. Context-based classification and quantization.
  • Slide 59
  • 59 Future Work. Context-based entropy coding: context shape optimization; handling the mismatch between the training set and the test data. Classification and quantization: a classification tree among the wavelet subbands. Apply these techniques to a video codec.
  • Slide 60
  • 60 ?