
Entropy-constrained overcomplete-based
coding of natural images
André F. de Araujo, Maryam Daneshi, Ryan Peng
Stanford University
Outline

- Motivation
- Overcomplete-based coding: overview
- Entropy-constrained overcomplete-based coding
- Experimental results
- Conclusion
- Future work
EE398A Project – Winter 2010/2011
Mar. 10, 2011
2
Motivation (1)

- Study of new (and unusual) schemes for image compression
- Recently, new methods have been developed using the overcomplete approach, but:
  - They targeted restricted compression scenarios
  - They did not fully exploit this approach's characteristics for compression
Motivation (2)
Why? Sparsity of the coefficients leads to better overall rate-distortion (RD) performance
Overcomplete coding: overview (1)

- K > N implies:
  - Basis vectors are not linearly independent
  - Example: for 8x8 blocks, N = 64 basis functions are needed to span the space of all possible signals; an overcomplete basis could have K = 128
- Two main tasks:
  1. Sparse coding
  2. Dictionary learning
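Since K > N, representations are not unique; a tiny numpy example (the dictionary and signal here are illustrative, not from the project) shows two different coefficient vectors reconstructing the same signal:

```python
import numpy as np

# Overcomplete dictionary for N = 2 dimensional signals with K = 3 atoms
D = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
y = np.array([0.6, 0.8])

# Two different coefficient vectors represent the same signal exactly;
# sparse coding seeks the one with the fewest non-zero entries
x_dense = np.array([0.6, 0.8, 0.0])   # uses the first two atoms
x_sparse = np.array([0.0, 0.0, 1.0])  # uses only the third atom
print(np.allclose(D @ x_dense, y), np.allclose(D @ x_sparse, y))  # True True
```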
Overcomplete coding: overview (2)

1. Sparse coding ("atom decomposition")
   - Compute the representation coefficients x based on the signal y (given) and the dictionary D (given)
   - An overcomplete D admits infinitely many solutions, so an approximation is computed
   - Commonly used algorithms: Matching Pursuit (MP), Orthogonal Matching Pursuit (OMP)
Overcomplete coding: overview (3)

Sparse coding (OMP)

Input: dictionary D, signal y, number of non-zero coefficients (NNZ) L (or error target ε)
Output: coefficient vector x

1. Set r = y (r: residual)
2. Project r onto every basis vector of D
3. Select the d_i from D with the maximum projection; add i to the index set ind
4. x = pinv(D(ind)) · y
5. r = y − D x
6. Stop if NNZ = L (or ||r||₂ < ε); otherwise, go to 2
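The steps above can be sketched in numpy; this is a minimal illustration (the toy dictionary, unit-norm atoms, and stopping only by NNZ are assumptions of this sketch, not the project's implementation):

```python
import numpy as np

def omp(D, y, L):
    """Orthogonal Matching Pursuit sketch: greedily select up to L atoms
    of D, re-solving a least-squares fit over the chosen atoms each step."""
    K = D.shape[1]
    ind = []                      # indices of selected atoms
    x = np.zeros(K)
    r = y.copy()                  # residual
    for _ in range(L):
        proj = D.T @ r            # project residual onto every atom
        i = int(np.argmax(np.abs(proj)))
        if i not in ind:
            ind.append(i)
        # optimal coefficients over the selected atoms: pinv(D(ind)) * y
        coef, *_ = np.linalg.lstsq(D[:, ind], y, rcond=None)
        x[:] = 0.0
        x[ind] = coef
        r = y - D @ x             # update residual
    return x

# Toy example: 2-D signal, overcomplete dictionary of K = 4 unit-norm atoms
D = np.array([[1.0, 0.0, 0.6, -0.8],
              [0.0, 1.0, 0.8,  0.6]])
y = np.array([0.6, 0.8])
x = omp(D, y, L=1)
print(np.allclose(D @ x, y))      # True: one atom reconstructs y exactly
```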
Overcomplete coding: overview (4)

2. Dictionary learning
   - Two basic stages (analogy with K-means):
     i. Sparse coding stage: use a pursuit algorithm to compute x (OMP is usually employed)
     ii. Dictionary update stage: adopt a particular strategy for updating the dictionary
   - Convergence issues: since the first stage does not guarantee the best match, the cost can increase and convergence cannot be assured
Overcomplete coding: overview (5)

2. Dictionary learning
   - Most relevant algorithms in the literature: K-SVD and MOD
   - The sparse coding stage is done in the same way by both
   - The codebook update stage differs:
     - MOD: updates the entire dictionary using the optimal adjustment for a given coefficient matrix
     - K-SVD: updates each basis vector one at a time using an SVD formulation, changing both dictionary and coefficients
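The MOD update has a closed form, D = Y · pinv(X); a minimal numpy sketch (renormalizing atoms to unit norm is an assumption of this sketch):

```python
import numpy as np

def mod_update(Y, X):
    """MOD dictionary update sketch: for fixed coefficients X, the
    dictionary minimizing ||Y - D X||_F is D = Y * pinv(X); atoms are
    then renormalized to unit norm."""
    D = Y @ np.linalg.pinv(X)
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    return D

# Sanity check: with enough training signals, the update recovers the
# dictionary that generated the data (toy sizes: N = 2, K = 3, M = 20)
rng = np.random.default_rng(0)
D_true = rng.standard_normal((2, 3))
D_true /= np.linalg.norm(D_true, axis=0, keepdims=True)
X = rng.standard_normal((3, 20))      # full row rank w.p. 1
Y = D_true @ X
print(np.allclose(mod_update(Y, X), D_true))  # True
```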
Entropy-const. OC-based coding (1)

- We introduce a compression scheme which employs entropy-constrained stages
- RD-OMP
  - Introduced by Gharavi-Alkhansar (ICIP 1998); uses the Lagrangian cost J = D + λR with a variable number of non-zero coefficients to select basis vectors
- EC Dictionary Learning
  - Introduced in this work; uses a framework inspired by EC-VQ to select basis vectors
Entropy-const. OC-based coding (2)

- RD-OMP – key ideas
  - Introduction of the Lagrangian cost
    - Estimation of the rate cost: R = R_ind + R_coeffs + R_EOB (R_EOB is fixed)
  - Stopping criterion / variable number of non-zero coefficients
    - Once no further improvement in the Lagrangian cost is achieved, the algorithm stops
Entropy-const. OC-based coding (3)

RD-OMP

Input: dictionary D, input signal y
Output: coefficient vector x

1. For every basis vector k (from 1 to K):
   a. x = pinv(D(ind ∪ {k})) · y
   b. Calculate J(k) = D + λR
2. Pick the basis vector with the smallest J and add it to ind
3. r = y − D x
4. Stop if (J_{n−1} − J_n) / J_{n−1} < ε; otherwise, go to 1
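A greedy numpy sketch of the loop above; the rate model (a fixed bit cost per retained atom) and the toy dictionary are simplifying assumptions, not the project's rate estimate:

```python
import numpy as np

def rd_omp(D, y, lam, bits_per_atom=12.0, eps=1e-3):
    """RD-OMP sketch: at each step try adding each unused atom, keep the
    candidate minimizing J = ||y - D x||^2 + lam * R, and stop once the
    relative improvement in J drops below eps."""
    K = D.shape[1]
    ind = []
    x = np.zeros(K)
    J_prev = float(y @ y)                 # cost with no atoms kept (R = 0)
    while len(ind) < K:
        best = None
        for k in range(K):
            if k in ind:
                continue
            trial = ind + [k]
            coef, *_ = np.linalg.lstsq(D[:, trial], y, rcond=None)
            dist = float(np.sum((y - D[:, trial] @ coef) ** 2))
            J = dist + lam * bits_per_atom * len(trial)
            if best is None or J < best[0]:
                best = (J, trial, coef)
        J_new, trial, coef = best
        if J_new >= J_prev or (J_prev - J_new) / J_prev < eps:
            break                         # no sufficient Lagrangian improvement
        ind, J_prev = trial, J_new
        x[:] = 0.0
        x[ind] = coef
    return x

# Small lambda keeps one atom; large lambda makes coding any atom too costly
D = np.array([[1.0, 0.0, 0.6, -0.8],
              [0.0, 1.0, 0.8,  0.6]])
y = np.array([0.6, 0.8])
print(np.count_nonzero(rd_omp(D, y, lam=1e-6)),
      np.count_nonzero(rd_omp(D, y, lam=10.0)))  # 1 0
```

This illustrates the key difference from plain OMP: the number of retained coefficients is not fixed in advance but chosen by the Lagrangian trade-off.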
Entropy-const. OC-based coding (4)

- EC Dictionary Learning – key ideas
  - Dictionary update strategy
    - K-SVD modifies both dictionary and coefficients, so a reduction in the Lagrangian cost is not assured
    - We use MOD, which provides the optimal adjustment assuming fixed coefficients
  - Introduction of a "rate cost update" stage
    - Analogous to the ECVQ algorithm for training data
    - Two pmfs must be updated: indexes and coefficients
Entropy-const. OC-based coding (5)

EC-Dictionary Learning

Input: input signal y
Output: dictionary D

1. Initialize D from y
2. Sparse coding stage: RD-OMP → find coefficients x
3. Rate cost update stage:
   a. pmf update (indexes and coefficients)
   b. Codeword length update: l_i = −log p(i)
4. Dictionary update stage: MOD dictionary update
5. Stop if (J_{n−1} − J_n) / J_{n−1} < ε; otherwise, go to 2
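The rate cost update stage (step 3) can be sketched for the index pmf; the add-one smoothing and the use of ideal code lengths l_i = −log2 p(i) are assumptions of this sketch:

```python
import numpy as np

def rate_cost_update(index_lists, K):
    """Rate cost update sketch: estimate the atom-index pmf from the
    indices selected across all training blocks, then assign ideal
    codeword lengths l_i = -log2 p(i). Add-one smoothing keeps unused
    atoms codable."""
    counts = np.ones(K)                   # +1 smoothing per atom
    for ind in index_lists:
        for i in ind:
            counts[i] += 1
    p = counts / counts.sum()
    lengths = -np.log2(p)                 # ideal codeword lengths in bits
    return p, lengths

# Atoms picked more often get shorter codes, lowering the rate term of J
p, lengths = rate_cost_update([[0, 1], [0], [0, 2]], K=4)
print(lengths[0] < lengths[3])  # True
```

The coefficient pmf would be updated analogously over the quantized coefficient values, and the resulting lengths feed the R term used by RD-OMP in the next iteration.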
Experiments (Setup)

- Rate calculation: optimal codebook (entropy) for each subband
- Test images: Lena, Boats, Harbour, Peppers
- Dictionary training experiments
  - Training data: 18 Kodak images downsampled to 128x128 (not including the images being coded)
  - Images were downsampled to 128x128 due to the very high computational complexity (for other experiments, higher resolutions were employed: 512x512, 256x256)
Experiments (Sparse Coding)

- Comparison of sparse coding methods
Experiments (Dict. learning)

- Comparison of dictionary learning methods
Experiments (Compression schemes) (1)

- Scheme 1: training and coding on the same image (the dictionary is sent)
- Scheme 2: training with a set of natural images and applying the dictionary to other images
Experiments (Compression schemes) (2)
Experiments (Compression schemes) (3)
Conclusion

- Improvement of sparse coding: RD-OMP
- Improvement of dictionary learning: entropy-constrained overcomplete dictionary learning
- Better overall performance compared to standard techniques
Future work

- Extension of the implementation to higher-resolution images
- Further investigation of the trade-off between K and N
- Evaluation against directional transforms
- Low-complexity implementation of the algorithms