Q6
a) We have defined the information rate distortion function as

$$R(D) = \min_{q(\hat{x}|x):\ \sum_{(x,\hat{x})} p(x) q(\hat{x}|x) d(x,\hat{x}) \le D} I(X; \hat{X}),$$

where the minimization is over all conditional distributions $q(\hat{x}|x)$ for which the joint distribution $p(x)q(\hat{x}|x)$ satisfies the expected distortion constraint. This is a standard minimization problem of a convex function over the convex set of all $q(\hat{x}|x) \ge 0$ satisfying $\sum_{\hat{x}} q(\hat{x}|x) = 1$ for all $x$ and $\sum_{x,\hat{x}} p(x) q(\hat{x}|x) d(x,\hat{x}) \le D$.
We can use the method of Lagrange multipliers to find the solution. We set up the functional

$$J(q) = \sum_{x}\sum_{\hat{x}} p(x) q(\hat{x}|x) \log \frac{q(\hat{x}|x)}{\sum_{x'} p(x') q(\hat{x}|x')} + \lambda \sum_{x}\sum_{\hat{x}} p(x) q(\hat{x}|x) d(x,\hat{x}) + \sum_{x} \nu(x) \sum_{\hat{x}} q(\hat{x}|x),$$

where the last term enforces the constraint that $q(\hat{x}|x)$ is a conditional probability mass function. Writing $q(\hat{x}) = \sum_{x} p(x) q(\hat{x}|x)$ for the induced output distribution, differentiating with respect to $q(\hat{x}|x)$, and setting the derivative to 0, we obtain

$$p(x) \log \frac{q(\hat{x}|x)}{q(\hat{x})} + \lambda p(x) d(x,\hat{x}) + \nu(x) = 0.$$

Setting $\log \mu(x) = \nu(x)/p(x)$, this gives

$$q(\hat{x}|x) = \frac{q(\hat{x})\, e^{-\lambda d(x,\hat{x})}}{\mu(x)}.$$
Since $\sum_{\hat{x}} q(\hat{x}|x) = 1$, we must have $\mu(x) = \sum_{\hat{x}} q(\hat{x})\, e^{-\lambda d(x,\hat{x})}$, or

$$q(\hat{x}|x) = \frac{q(\hat{x})\, e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\, e^{-\lambda d(x,\hat{x}')}}$$

for all $x$. Multiplying by $p(x)$, summing over $x$, and dividing by $q(\hat{x})$ where $q(\hat{x}) > 0$, we obtain

$$\sum_{x} \frac{p(x)\, e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\, e^{-\lambda d(x,\hat{x}')}} = 1.$$

We can combine these $|\hat{\mathcal{X}}|$ equations with the equation defining the distortion and calculate $\lambda$ and the $|\hat{\mathcal{X}}|$ unknowns $q(\hat{x})$, from which we can find the optimum conditional distribution.
The above analysis is valid if $q(\hat{x})$ is unconstrained, i.e., $q(\hat{x}) > 0$. The inequality condition $q(\hat{x}) \ge 0$ is covered by the Kuhn–Tucker conditions, which reduce to

$$\frac{\partial J}{\partial q(\hat{x}|x)} \begin{cases} = 0 & \text{if } q(\hat{x}|x) > 0, \\ \ge 0 & \text{if } q(\hat{x}|x) = 0. \end{cases}$$

Substituting the value of the derivative, we obtain the conditions for the minimum as

$$\sum_{x} \frac{p(x)\, e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\, e^{-\lambda d(x,\hat{x}')}} \begin{cases} = 1 & \text{if } q(\hat{x}) > 0, \\ \le 1 & \text{if } q(\hat{x}) = 0. \end{cases}$$
This characterization enables us to check whether a given $q(\hat{x})$ is a solution to the minimization problem. However, it is not easy to solve for the optimum output distribution from these equations. In part f) we describe an iterative algorithm for computing the rate distortion function. This algorithm is a special case of a general algorithm for finding the minimum relative entropy distance between two convex sets of probability densities.
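As a quick numerical illustration of this characterization, the sketch below evaluates the Kuhn–Tucker quantity for a candidate output distribution; the source distribution, distortion matrix, $\lambda$, and candidate $q(\hat{x})$ are all made-up values, not taken from the problem.

```python
import numpy as np

# Hypothetical example: binary source, Hamming distortion (illustrative values).
p = np.array([0.6, 0.4])              # source distribution p(x)
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])            # distortion d(x, xhat)
lam = 2.0                             # assumed Lagrange multiplier
q_out = np.array([0.7, 0.3])          # candidate output distribution q(xhat)

# Kuhn-Tucker quantity for each xhat:
#   c(xhat) = sum_x p(x) e^{-lam d(x,xhat)} / sum_{xhat'} q(xhat') e^{-lam d(x,xhat')}
expd = np.exp(-lam * d)               # e^{-lam d(x,xhat)}, shape (|X|, |Xhat|)
denom = expd @ q_out                  # inner sum over xhat', one value per x
c = (p / denom) @ expd                # outer sum over x

# Optimality: c = 1 wherever q_out > 0, and c <= 1 wherever q_out = 0.
print(c)
```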
b)
Consider the following problem: given two convex sets $A$ and $B$ in $\mathbb{R}^n$, we would like to find the minimum distance between them,

$$d_{\min} = \min_{a \in A,\, b \in B} d(a, b),$$

where $d(a, b)$ is the Euclidean distance between $a$ and $b$. An intuitively obvious algorithm would be to take any point $x \in A$ and find the $y \in B$ that is closest to it; then fix this $y$ and find the point in $A$ closest to it. Repeating this process, the distance clearly decreases at each stage. Does it converge to the minimum distance between the two sets? Csiszár and Tusnády have shown that if the sets are convex and the distance measure satisfies certain conditions, this alternating minimization algorithm does indeed converge to the minimum. In particular, if the sets are sets of probability distributions and the distance measure is the relative entropy, the algorithm converges to the minimum relative entropy between the two sets of distributions.
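A minimal sketch of this alternating minimization for the Euclidean case, using two made-up convex sets (discs in the plane, chosen so the projections have closed form); `project_disc` and all the numbers are illustrative.

```python
import numpy as np

def project_disc(pt, center, radius):
    """Closest point to pt in the disc {x : |x - center| <= radius}."""
    v = pt - center
    n = np.linalg.norm(v)
    return pt if n <= radius else center + radius * v / n

# Hypothetical convex sets: two disjoint discs in R^2.
cA, rA = np.array([0.0, 0.0]), 1.0
cB, rB = np.array([4.0, 1.0]), 1.5

x = cA.copy()                      # start from any point of A
for _ in range(50):
    y = project_disc(x, cB, rB)    # closest point of B to the current x
    x = project_disc(y, cA, rA)    # closest point of A to that y

# Converges to d_min = |cA - cB| - rA - rB for two disjoint discs.
print(np.linalg.norm(x - y))
```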
To apply this algorithm to rate distortion, we have to rewrite the rate distortion function as a minimum of the relative entropy between two sets.
c)
Let $q(\hat{x}|x)$ be a given conditional distribution and let $r^*(\hat{x}) = \sum_{x} p(x) q(\hat{x}|x)$ be the induced marginal on $\hat{X}$. For any output distribution $r(\hat{x})$, we have

$$D\big(p(x)q(\hat{x}|x)\,\big\|\,p(x)r(\hat{x})\big) - D\big(p(x)q(\hat{x}|x)\,\big\|\,p(x)r^*(\hat{x})\big) = D\big(r^*(\hat{x})\,\big\|\,r(\hat{x})\big) \ge 0,$$

so the marginal $r^*(\hat{x})$ is the output distribution that minimizes $D\big(p(x)q(\hat{x}|x)\,\|\,p(x)r(\hat{x})\big)$.
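A small numerical sanity check of this claim, with an illustrative $p(x)$ and $q(\hat{x}|x)$ of my own choosing: the marginal is never beaten by random alternative output distributions.

```python
import numpy as np

rng = np.random.default_rng(0)

p = np.array([0.6, 0.4])                       # p(x), illustrative
q = np.array([[0.8, 0.2],
              [0.3, 0.7]])                     # q(xhat|x), rows sum to 1

def rel_ent(q_cond, r):
    """D(p(x)q(xhat|x) || p(x)r(xhat)) in nats."""
    joint = p[:, None] * q_cond
    return np.sum(joint * np.log(joint / (p[:, None] * r[None, :])))

r_star = p @ q                                 # the marginal r*(xhat)
best = rel_ent(q, r_star)
for _ in range(1000):
    r = rng.dirichlet(np.ones(2))              # random alternative r(xhat)
    assert rel_ent(q, r) >= best - 1e-12       # marginal achieves the minimum
print(best)                                    # equals I(X; Xhat)
```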
d)
Using the result from c), with $q(\hat{x}) = \sum_{x} p(x) q(\hat{x}|x)$, we have

$$I(X; \hat{X}) = D\big(p(x)q(\hat{x}|x)\,\big\|\,p(x)q(\hat{x})\big) = \min_{r(\hat{x})} D\big(p(x)q(\hat{x}|x)\,\big\|\,p(x)r(\hat{x})\big).$$
e)
If $A$ is the set of all joint distributions $p(x)q(\hat{x}|x)$ with marginal $p(x)$ that satisfy the distortion constraint, and $B$ is the set of product distributions $p(x)r(\hat{x})$ with arbitrary $r(\hat{x})$, we can write

$$R(D) = \min_{q \in A}\ \min_{r \in B}\ D\big(p(x)q(\hat{x}|x)\,\big\|\,p(x)r(\hat{x})\big),$$

so the rate distortion function is the minimum relative entropy distance between the two convex sets $A$ and $B$.
f)
We now apply the process of alternating minimization, which in this case is called the Blahut–Arimoto algorithm. We begin with a choice of $\lambda$ and an initial output distribution $r(\hat{x})$, and calculate the conditional distribution $q(\hat{x}|x)$ that minimizes the mutual information subject to the distortion constraint. Using the method of Lagrange multipliers as in part a), this minimization yields

$$q(\hat{x}|x) = \frac{r(\hat{x})\, e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} r(\hat{x}')\, e^{-\lambda d(x,\hat{x}')}}.$$

For this conditional distribution $q(\hat{x}|x)$, we then calculate the output distribution that minimizes the mutual information, which by c) is the marginal

$$r(\hat{x}) = \sum_{x} p(x) q(\hat{x}|x).$$

We use this output distribution as the starting point of the next iteration; each step decreases the relative entropy, and by the Csiszár–Tusnády result the iteration converges to the minimum.
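A minimal sketch of the iteration just described, for a binary source with Hamming distortion; `blahut_arimoto`, the source distribution, and the choice of $\lambda$ are my own illustrative assumptions. It returns one $(D, R)$ point on the curve for the chosen $\lambda$.

```python
import numpy as np

def blahut_arimoto(p, d, lam, n_iter=200):
    """One point on the R(D) curve for source p(x), distortion d(x, xhat),
    and Lagrange multiplier lam, via the alternating minimization above."""
    nx, nxh = d.shape
    r = np.full(nxh, 1.0 / nxh)               # initial output distribution r(xhat)
    expd = np.exp(-lam * d)                    # e^{-lam d(x,xhat)}
    for _ in range(n_iter):
        q = r[None, :] * expd                  # unnormalized q(xhat|x)
        q /= q.sum(axis=1, keepdims=True)      # normalize over xhat
        r = p @ q                              # re-minimize: r = marginal of q
    joint = p[:, None] * q
    D = np.sum(joint * d)                                     # expected distortion
    R = np.sum(joint * np.log(q / r[None, :])) / np.log(2)    # I(X;Xhat) in bits
    return D, R

# Binary symmetric source with Hamming distortion, where R(D) = 1 - H(D).
p = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
print(blahut_arimoto(p, d, lam=2.0))
```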