Design of Optimal Quantizers for Distributed Source Coding
David Rebollo-Monedero, Rui Zhang and Bernd Girod
Information Systems Lab, Dept. of Electrical Engineering, Stanford University
25 Mar 03
Outline
- Theory of optimal quantizer design
  - Coding schemes
  - Optimality conditions and Lloyd algorithm
- Experiments
  - Gaussian scalar asymmetric case
  - Distortion and rate-distortion optimized design
Motivation
Low-cost sensor network. [Diagram: several remote sensors and one local sensor report to a central unit; the local sensor supplies side information.]
Quantization Design Problem
[Block diagram: sender 1 quantizes the source vector X1 with q1(x1) into the index Q1 (Encoder 1); sender 2 quantizes X2 with q2(x2) into Q2 (Encoder 2). The receiver decodes the indices and reconstructs X̂1, X̂2 with x̂(q, y), using the side information Y.]
Given: the lossless coder. Design: the quantizers.
State of the Art
- Heuristic, sub-optimal design [Kusuma et al., 2001]
- Extensions of the Lloyd algorithm, network quantization [Fleming and Effros, 2001], [Muresan and Effros, 2002]
  - Distortion-only optimized
  - Rates as functions of the quantization index [Fleming et al., to appear]
RD Optimized Design
[Block diagram as before: quantization, coding and reconstruction of the source vectors X1, X2 into X̂1, X̂2, with side information Y at the decoder.]
The rate measure r(q, y) models the coder.
Costs:
- Distortion: D = E[d(X, X̂)]
- Rate: R = E[r(Q, Y)]
- Lagrangian cost: J = (1 − λ) D + λ R
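As a hypothetical numeric illustration of the Lagrangian cost above, the sketch below applies a 4-level scalar quantizer to Gaussian samples and measures the rate as the empirical entropy of the index (one of the rate measures discussed on the next slide). All concrete values (thresholds, levels, λ) are illustrative, not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 100_000)      # source samples X ~ N(0, 1)

edges = np.array([-1.0, 0.0, 1.0])     # 4 quantization intervals
q = np.digitize(x, edges)              # quantization index per sample

# Midpoint-style reconstruction levels (suboptimal; the centroid
# condition would replace these with E[X | Q = q]).
levels = np.array([-1.5, -0.5, 0.5, 1.5])
x_hat = levels[q]

D = np.mean((x - x_hat) ** 2)          # distortion D = E[d(X, Xhat)]
p = np.bincount(q, minlength=4) / q.size
R = -np.sum(p * np.log2(p))            # rate R = H(Q) in bits

lam = 0.5
J = (1 - lam) * D + lam * R            # Lagrangian cost J = (1-λ)D + λR
```

Sweeping λ from 0 to 1 trades distortion against rate, tracing the operational rate-distortion curve of this quantizer family.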
Rate Measure
Choice of rate measure r(q, y), resulting rate R = E[r(Q, Y)], and coding scheme modeled:
- r(q, y) = −log p_{Q|Y}(q1, q2 | y)  →  R = H(Q1, Q2 | Y): distributed Slepian-Wolf coding (also joint coding).
- r(q, y) = −log p_{Q1}(q1) p_{Q2}(q2)  →  R = H(Q1) + H(Q2): separate encoding, all dependencies ignored.
- r(q, y) = l1(q1) + l2(q2)  →  R = E[l1(Q1) + l2(Q2)]: specific codebook with codeword lengths li(qi).
- r(q, y) = −a log p_{Q1}(q1) − b log p_{Q1|Y}(q1 | y)  →  R = a H(Q1) + b H(Q1 | Y): linear combinations of the previous cases; a more general coding characterization.
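To make the difference between these rate measures concrete, the sketch below compares separate encoding, R = H(Q1) + H(Q2), against Slepian-Wolf coding with side information, R = H(Q1, Q2 | Y), on a made-up joint pmf (the numbers are invented for illustration only):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a pmf given as an array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy joint pmf p[q1, q2, y] over binary q1, q2, y, strongly dependent on y.
p = np.array([[[0.20, 0.02], [0.02, 0.01]],
              [[0.01, 0.02], [0.02, 0.70]]])

# Separate encoding: marginal entropies, all dependencies ignored.
H_separate = entropy(p.sum(axis=(1, 2))) + entropy(p.sum(axis=(0, 2)))

# Slepian-Wolf with side information: H(Q1, Q2 | Y) = H(Q1, Q2, Y) - H(Y).
H_cond = entropy(p.reshape(-1)) - entropy(p.sum(axis=(0, 1)))
```

Since conditioning cannot increase entropy, H(Q1, Q2 | Y) ≤ H(Q1) + H(Q2) always holds; the gap is the rate saved by exploiting the side information.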
Disconnected Quantization Regions
[Figure: the real line from x0 = x_inf to xM = x_sup is partitioned into intervals I1, ..., IM, with Ii = (x_{i−1}, x_i) the i-th interval; the q-th quantization region Rq, q = 1, ..., N, is a union of such intervals and may be disconnected.]
Side information helps distinguish source values. Reusing intervals allows reducing both the distortion and the entropy of the quantization index.
Optimal Reconstruction
Centroid condition in the case of a quadratic distortion measure: estimate the source vectors given the quantization indices and the side information,
x̂*(q, y) = E[X | Q = q, Y = y]
[Figure: the reconstruction x̂(q, y) as the centroid of f_{X|Y}(x | y) over the region Rq.]
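The centroid condition can be sketched by Monte Carlo: with samples of (X, Y), the empirical conditional mean of X over samples sharing an index q and (approximately) a side-information value y estimates E[X | Q = q, Y = y]. Binning y is an approximation introduced here, not part of the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 200_000)
y = x + rng.normal(0.0, 0.5, x.size)          # side information Y = X + Z

q = np.digitize(x, [-1.0, 0.0, 1.0])          # 4-index quantizer
y_bin = np.digitize(y, np.linspace(-3, 3, 25))  # coarse bins for y

def centroid(q0, y0_bin):
    """Empirical E[X | Q = q0, Y in bin y0_bin]."""
    sel = (q == q0) & (y_bin == y0_bin)
    return x[sel].mean()
```

The reconstruction always lies inside the interval selected by q, and it shifts within that interval as y changes, which is exactly how side information sharpens the estimate.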
Optimal Quantization
Choose the quantization index to minimize the estimated cost (nearest-neighbor condition): for all x1,
q1*(x1) = argmin_{q1} j̃1(x1, q1)
where j̃1(x1, q1) is the estimated cost at encoder 1. The boundaries x1i of the intervals correspond to the intersection points of the curves j̃1(x1, q1); the boundaries and the minimizing indices q1 define q1*(x1).
[Figure: curves j̃1(x1, q1) for q1 = 1, ..., 4 over x1, with boundaries x11, ..., x15 and resulting index sequence q = 1, 2, 3, 4, 1, 2 - the same index recurs in disconnected intervals.]
Estimated Cost Functions
Estimated Lagrangian cost at encoder 1:
j̃1(x1, q1) = (1 − λ) d̃1(x1, q1) + λ r̃1(x1, q1)
Estimated distortion at encoder 1 (d is the distortion measure, x̂ the reconstruction function):
d̃1(x1, q1) = E[d((x1, X2), x̂((q1, Q2), Y)) | X1 = x1]
Estimated rate at encoder 1 (r is the rate measure):
r̃1(x1, q1) = E[r((q1, Q2), Y) | X1 = x1]
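A small sketch of the nearest-neighbor rule built from these estimated costs: for each source value x1 on a grid, the index minimizing j̃1(x1, q1) = (1 − λ) d̃1 + λ r̃1 is selected. The cost tables below are illustrative stand-ins for the conditional expectations above (quadratic distortion to fixed levels, assumed per-index rates), not values from the talk.

```python
import numpy as np

x1_grid = np.linspace(-3, 3, 601)
levels = np.array([-1.5, -0.5, 0.5, 1.5])   # candidate reconstructions
rate = np.array([2.2, 1.6, 1.6, 2.2])       # assumed per-index rates (bits)

lam = 0.3
d1 = (x1_grid[:, None] - levels[None, :]) ** 2      # estimated distortion
j1 = (1 - lam) * d1 + lam * rate[None, :]           # estimated Lagrangian cost

q1_star = np.argmin(j1, axis=1)             # nearest-neighbor index per x1
```

The interval boundaries are exactly the x1 values where two cost curves intersect; with these equal-curvature parabolas the optimal index is monotone in x1, whereas with side information present the full algorithm can also produce the recurring indices of the previous slide.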
New Lloyd Algorithm
1. Set k = 1 and choose initial quantizers qi^(1)(xi).
2. Find the best reconstructor for the current quantizers: x̂^(k)(q, y).
3. Update the rate measure for the current quantizers: r^(k)(q, y).
4. Find the cost J^(k) for the current quantizers, reconstructor and rate measure.
5. If the cost has converged, end. Otherwise, find the best quantizers qi^(k+1)(xi) for the current reconstructor and rate measure, set k = k + 1, and go to step 2.
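The loop above can be sketched for a single encoder with side information (the asymmetric case of the experiments). This is an illustrative simplification, not the authors' implementation: Y is binned so conditional statistics can be estimated from samples, the quantizer is defined on fine x cells, and the three steps (centroid reconstruction, rate-measure update r(q, y) = −log p(q | y), cost-minimizing re-assignment) alternate.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N, B, G = 100_000, 4, 14, 62          # samples, indices, y bins, x cells
lam = 0.2

x = rng.normal(0.0, 1.0, n)              # source X ~ N(0, 1)
y = x + rng.normal(0.0, 0.5, n)          # side information Y = X + Z
y_bin = np.digitize(y, np.linspace(-3, 3, 13))   # 14 y bins incl. outer
xg = np.digitize(x, np.linspace(-3, 3, 61))      # 62 x cells for the quantizer

q = np.digitize(x, [-1.0, 0.0, 1.0])     # initial 4-index quantizer

costs = []
for _ in range(10):
    # 1) Best reconstructor: xhat(q, y) = E[X | Q = q, Y = y] (empirical).
    xhat = np.zeros((N, B))
    for qi in range(N):
        for yi in range(B):
            sel = (q == qi) & (y_bin == yi)
            xhat[qi, yi] = x[sel].mean() if sel.any() else 0.0
    # 2) Rate-measure update: r(q, y) = -log2 p(q | y), lightly smoothed.
    joint = np.zeros((N, B))
    np.add.at(joint, (q, y_bin), 1.0)
    r = -np.log2((joint + 1e-9) / (joint.sum(axis=0, keepdims=True) + N * 1e-9))
    # 3) Cost J^(k), then best quantizer for this reconstructor and rate measure.
    d_all = (x[:, None] - xhat[:, y_bin].T) ** 2        # (n, N) distortions
    j_all = (1 - lam) * d_all + lam * r[:, y_bin].T     # (n, N) Lagrangian costs
    costs.append(j_all[np.arange(n), q].mean())         # current cost J^(k)
    j_cell = np.zeros((G, N))                           # cell-averaged costs,
    cnt = np.zeros(G)                                   # so q depends on x only
    np.add.at(j_cell, xg, j_all)
    np.add.at(cnt, xg, 1.0)
    q = np.argmin(j_cell / np.maximum(cnt, 1.0)[:, None], axis=1)[xg]
```

Because each of the three steps can only lower the empirical Lagrangian cost (up to the smoothing constants), the recorded sequence J^(k) is nonincreasing, which is what the convergence test in step 5 relies on.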
Experiment Setup
[Block diagram: the encoder maps the source value X to the quantization index Q = q(x); the decoder forms the reconstructed source value X̂ = x̂(q, y) from Q and the side information Y = X + Z.]
X ~ N(0, σX² = 1), noise Z ~ N(0, σZ²), with X and Z statistically independent.
Optimal Quantizers
- Distortion optimized: J = D (λ = 0), 4 quantization indices.
- Rate-distortion optimized: J = (1 − λ) D + λ R with λ > 0, R = H(Q | Y).
Reconstruction and Cost Functions
[Plots: the reconstruction function x̂(q, y) over (x, y), and the estimated cost function j̃1(x1, q1).]
RD Optimized Design, R = H(Q)
J = (1 − λ) D + λ R, with rate measure r(q, y) = −log p_Q(q); SNR_IN = σX²/σZ² = 5 dB; SNR_OUT = σX²/D.
[Plot: SNR_OUT [dB] (6 to 22) versus R [bit] (0 to 2.5), with curves: Wyner-Ziv bound, conditional, asymmetric distributed, independent with side info, uniform with side info.]
RD Optimized Design, R = H(Q|Y)
J = (1 − λ) D + λ R, with rate measure r(q, y) = −log p_{Q|Y}(q | y); SNR_IN = σX²/σZ² = 5 dB; SNR_OUT = σX²/D.
[Plot: SNR_OUT [dB] (6 to 22) versus R [bit] (0 to 2.5), with curves: Wyner-Ziv bound, conditional, asymmetric distributed, independent with side info, uniform with side info.]
Conclusions
- Design of entropy-constrained quantizers with a Lloyd algorithm; the coding scheme is modeled by the rate measure r(q, y).
- Experiments with Gaussian statistics:
  - Nearly uniform quantizers.
  - When R = H(Q) (and also when J = D):
    - Intervals are highly reused.
    - Improvement w.r.t. the classical entropy-constrained quantizer (ΔSNR_OUT ≈ 1 dB at SNR_IN = 5 dB, growing with SNR_IN).
  - When R = H(Q | Y):
    - Performance almost identical to the conditional quantizer.
    - Also almost identical to the classical entropy-constrained quantizer; reusing intervals is not as important.
Optimal Rate Measure
- The rate measure needs to be updated: if the quantizer changes, then the probability of the quantization index changes. With quantization indices q = (q1, q2), a rate measure built from the old probability of the indices is r(q, y) = −log p_{Q|Y}^old(q | y), whereas the rate measure that minimizes the (expected) rate uses the new probability: r*(q, y) = −log p_{Q|Y}(q | y).
- Similarly for the other coding cases.
- When working with specific codeword lengths, a method for designing improved codes is required.
- Simpler rate measures may lead to simpler estimated rate functions: with codeword lengths, r(q, y) = r(q) = l1(q1) + l2(q2), so the estimated rate function at encoder 1 is r̃1(x1, q1) = r̃1(q1) = l1(q1).
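The first bullet can be checked numerically: for a fixed conditional pmf p_new(q | y) of the current quantizer, a rate measure built from stale probabilities p_old overestimates the achievable rate, because the cross-entropy E[−log p_old(Q | Y)] is never below the conditional entropy E[−log p_new(Q | Y)] (Gibbs' inequality). The pmfs below are made-up examples.

```python
import numpy as np

p_new = np.array([0.6, 0.3, 0.08, 0.02])    # current p(q | y) for some y
p_old = np.array([0.25, 0.25, 0.25, 0.25])  # index probabilities before update

rate_stale = -np.sum(p_new * np.log2(p_old))   # expected rate, stale measure
rate_star = -np.sum(p_new * np.log2(p_new))    # expected rate, r*(q, y)
```

Here the stale (uniform) measure charges exactly 2 bits, while the updated measure charges only the true conditional entropy, so updating r(q, y) inside the Lloyd loop strictly helps whenever the index distribution has changed.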
Distortion Optimized Design
J = D, 4 quantization indices; SNR_OUT = σX²/D.
[Plot: SNR_OUT [dB] (5 to 35) versus SNR_IN = σX²/σZ² [dB] (−10 to 20), with curves: Wyner-Ziv bound (rate = 2), conditional, asymmetric distributed, independent with side info, ignoring side info, uniform with side info.]