
# Scribe Note 7-1


Q6

a) We have defined the information rate distortion function as

$$R(D) = \min_{q(\hat{x}|x):\ \sum_{x,\hat{x}} p(x)q(\hat{x}|x)d(x,\hat{x}) \le D} I(X;\hat{X}),$$

where the minimization is over all conditional distributions $q(\hat{x}|x)$ for which the joint distribution $p(x)q(\hat{x}|x)$ satisfies the expected distortion constraint. This is a standard minimization problem of a convex function over the convex set of all $q(\hat{x}|x) \ge 0$ satisfying $\sum_{\hat{x}} q(\hat{x}|x) = 1$ for all $x$ and $\sum_{x,\hat{x}} p(x)q(\hat{x}|x)d(x,\hat{x}) \le D$. We can use the method of Lagrange multipliers to find the solution. We set up the functional

$$J(q) = \sum_x \sum_{\hat{x}} p(x)q(\hat{x}|x)\log\frac{q(\hat{x}|x)}{\sum_{x'} p(x')q(\hat{x}|x')} + \lambda \sum_x \sum_{\hat{x}} p(x)q(\hat{x}|x)d(x,\hat{x}) + \sum_x \nu(x) \sum_{\hat{x}} q(\hat{x}|x),$$

where the last term enforces the constraint that $q(\hat{x}|x)$ is a conditional probability mass function. Writing $q(\hat{x}) = \sum_x p(x)q(\hat{x}|x)$ for the induced output distribution, differentiating with respect to $q(\hat{x}|x)$, and setting $\partial J/\partial q(\hat{x}|x) = 0$, we obtain

$$p(x)\left[\log\frac{q(\hat{x}|x)}{q(\hat{x})} + \lambda d(x,\hat{x}) + \log\mu(x)\right] = 0,$$

where $\log\mu(x) = \nu(x)/p(x)$, so that

$$q(\hat{x}|x) = \frac{q(\hat{x})\,e^{-\lambda d(x,\hat{x})}}{\mu(x)}.$$

Since $\sum_{\hat{x}} q(\hat{x}|x) = 1$, we must have $\mu(x) = \sum_{\hat{x}} q(\hat{x})e^{-\lambda d(x,\hat{x})}$, or

$$q(\hat{x}|x) = \frac{q(\hat{x})\,e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\,e^{-\lambda d(x,\hat{x}')}}$$

for all $x$ and $\hat{x}$. We can combine these equations with the equation defining the distortion, $D = \sum_{x,\hat{x}} p(x)q(\hat{x}|x)d(x,\hat{x})$, to calculate $\lambda$ and the unknowns $q(\hat{x})$, and thus find the optimum conditional distribution.

The above analysis is valid if $q(\hat{x})$ is unconstrained (i.e., strictly positive). The inequality condition $q(\hat{x}) \ge 0$ is covered by the Kuhn–Tucker conditions, which reduce to

$$\frac{\partial J}{\partial q(\hat{x}|x)} \begin{cases} = 0 & \text{if } q(\hat{x}|x) > 0, \\ \ge 0 & \text{if } q(\hat{x}|x) = 0. \end{cases}$$

Substituting the value of the derivative, we obtain the conditions for the minimum as

$$\sum_x \frac{p(x)\,e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\,e^{-\lambda d(x,\hat{x}')}} \begin{cases} = 1 & \text{if } q(\hat{x}) > 0, \\ \le 1 & \text{if } q(\hat{x}) = 0. \end{cases}$$

This characterization enables us to check whether a given $q(\hat{x})$ is a solution to the minimization problem. However, it is not easy to solve for the optimum output distribution from these equations. In the parts below we develop an iterative algorithm for computing the rate distortion function. This algorithm is a special case of a general algorithm for finding the minimum relative-entropy distance between two convex sets of probability densities.

b)

Consider the following problem: given two convex sets $A$ and $B$ in $\mathbb{R}^n$, as shown in the following figure, we would like to find the minimum distance between them,

$$d_{\min} = \min_{a \in A,\ b \in B} d(a,b),$$

where $d(a,b)$ is the Euclidean distance between $a$ and $b$. An intuitively obvious algorithm is to take any point $x \in A$ and find the $y \in B$ that is closest to it; then fix this $y$ and find the point in $A$ closest to it. Repeating this process, it is clear that the distance decreases at each stage. Does it converge to the minimum distance between the two sets? Csiszár and Tusnády have shown that if the sets are convex and the distance satisfies certain conditions, this alternating minimization algorithm will indeed converge to the minimum. In particular, if the sets are sets of probability distributions and the distance measure is the relative entropy, the algorithm converges to the minimum relative entropy between the two sets of distributions. To apply this algorithm to rate distortion, we have to rewrite the rate distortion function as a minimum of the relative entropy between two sets.
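The alternating-minimization idea can be sketched numerically. The toy example below (the disk sets, the projection helper, and all names are my own illustration, not from the notes) alternates Euclidean projections between two disjoint disks in the plane; the iterates converge to the closest pair of points, whose distance is 5 − 1 − 1 = 3 for these two disks.

```python
import math

# Illustrative alternating minimization between two convex sets (disks in R^2).
# All names and the choice of sets are hypothetical, for demonstration only.

def proj_disk(p, center, radius):
    """Euclidean projection of point p onto a closed disk."""
    dx, dy = p[0] - center[0], p[1] - center[1]
    d = math.hypot(dx, dy)
    if d <= radius:
        return p  # already inside the disk
    return (center[0] + radius * dx / d, center[1] + radius * dy / d)

def alternating_min(start, steps=50):
    """Alternate closest-point steps between disks A and B; return final distance."""
    A = ((0.0, 0.0), 1.0)   # disk A: (center, radius)
    B = ((4.0, 3.0), 1.0)   # disk B: centers are distance 5 apart
    x = proj_disk(start, *A)          # any point x in A
    y = proj_disk(x, *B)              # closest y in B
    for _ in range(steps):
        y = proj_disk(x, *B)          # fix x, find closest point in B
        x = proj_disk(y, *A)          # fix y, find closest point in A
    return math.hypot(x[0] - y[0], x[1] - y[1])

# Minimum distance between the disks is 5 - 1 - 1 = 3.
print(round(alternating_min((0.0, 0.5)), 6))  # → 3.0
```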

c)

Let $r(\hat{x}) = \sum_x p(x)q(\hat{x}|x)$ be the output marginal induced by $q$. We have

$$I(X;\hat{X}) = D\bigl(p(x)q(\hat{x}|x)\,\|\,p(x)r(\hat{x})\bigr) = \min_{r'} D\bigl(p(x)q(\hat{x}|x)\,\|\,p(x)r'(\hat{x})\bigr),$$

since the induced marginal $r(\hat{x})$ minimizes the relative entropy over all output distributions $r'(\hat{x})$.

d)

Using the result from c), we can write the rate distortion function as a double minimization,

$$R(D) = \min_{q:\ \sum_{x,\hat{x}} p(x)q(\hat{x}|x)d(x,\hat{x}) \le D}\ \min_{r} D\bigl(p(x)q(\hat{x}|x)\,\|\,p(x)r(\hat{x})\bigr).$$

e)

If $A$ is the set of all joint distributions with marginal $p(x)$ that satisfy the distortion constraint, and if $B$ is the set of product distributions $p(x)r(\hat{x})$ with arbitrary $r(\hat{x})$, we can write

$$R(D) = \min_{p(x)q(\hat{x}|x)\, \in\, A}\ \min_{p(x)r(\hat{x})\, \in\, B} D\bigl(p(x)q(\hat{x}|x)\,\|\,p(x)r(\hat{x})\bigr).$$

f)

We now apply the process of alternating minimization, which in this case is called the Blahut–Arimoto algorithm. We begin with a choice of $\lambda$ and an initial output distribution $r(\hat{x})$ and calculate the $q(\hat{x}|x)$ that minimizes the mutual information subject to the distortion constraint. We can use the method of Lagrange multipliers for this minimization to obtain

$$q(\hat{x}|x) = \frac{r(\hat{x})\,e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} r(\hat{x}')\,e^{-\lambda d(x,\hat{x}')}}.$$

For this conditional distribution $q(\hat{x}|x)$, we calculate the output distribution $r(\hat{x})$ that minimizes the mutual information, which is

$$r(\hat{x}) = \sum_x p(x)q(\hat{x}|x).$$

We use this output distribution as the starting point of the next iteration. Each step of the iteration, first minimizing over $q(\cdot|\cdot)$ and then minimizing over $r(\cdot)$, reduces the relative entropy. Since the relative entropy is bounded below by zero, there is a limit, and the limit has been shown to be $R(D)$.

The values of $D$ and $R(D)$ depend on $\lambda$; thus, choosing $\lambda$ appropriately sweeps out the $R(D)$ curve.
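The iteration in f) can be sketched in code. This is a minimal illustration under stated assumptions, not a reference implementation: the function name and parameters are my own, it fixes a single $\lambda$, and it uses a Bernoulli(1/2) source with Hamming distortion, for which the closed form $R(D) = 1 - H(D)$ lets us check the answer.

```python
import math

# Minimal Blahut-Arimoto sketch for the rate-distortion function at a fixed
# lambda. All names are illustrative. p: source pmf; d[x][xh]: distortion.

def blahut_arimoto_rd(p, d, lam, iters=500):
    """Alternate q(xhat|x) and r(xhat) updates; return (D, R) in bits."""
    n, m = len(p), len(d[0])
    r = [1.0 / m] * m                              # initial output distribution
    for _ in range(iters):
        # q(xhat|x) = r(xhat) e^{-lam d(x,xhat)} / normalizer
        q = []
        for x in range(n):
            w = [r[xh] * math.exp(-lam * d[x][xh]) for xh in range(m)]
            z = sum(w)
            q.append([wi / z for wi in w])
        # r(xhat) = sum_x p(x) q(xhat|x)
        r = [sum(p[x] * q[x][xh] for x in range(n)) for xh in range(m)]
    D = sum(p[x] * q[x][xh] * d[x][xh] for x in range(n) for xh in range(m))
    R = sum(p[x] * q[x][xh] * math.log2(q[x][xh] / r[xh])
            for x in range(n) for xh in range(m) if q[x][xh] > 0)
    return D, R

p = [0.5, 0.5]                                     # Bernoulli(1/2) source
d = [[0, 1], [1, 0]]                               # Hamming distortion
D, R = blahut_arimoto_rd(p, d, lam=2.0)
# Closed form for this source: R(D) = 1 - H(D).
H = -D * math.log2(D) - (1 - D) * math.log2(1 - D)
print(abs(R - (1 - H)) < 1e-6)  # → True
```

Sweeping `lam` over a grid and collecting the returned $(D, R)$ pairs traces out the $R(D)$ curve, as noted above.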

g)

A similar procedure can be applied to the calculation of channel capacity. Again we rewrite the definition of channel capacity as a double maximization,

$$C = \max_{r(x)}\ \max_{q(x|y)} \sum_x \sum_y r(x)\,p(y|x)\log\frac{q(x|y)}{r(x)},$$

and alternately maximize over the input distribution $r(x)$ and the backward channel $q(x|y)$.
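As a sketch of this alternating maximization (all names are my own; the example assumes a binary symmetric channel with crossover 0.1, whose capacity $1 - H(0.1)$ is known in closed form):

```python
import math

# Illustrative Blahut-Arimoto sketch for channel capacity.
# W[x][y] = p(y|x) is the channel transition matrix.

def blahut_arimoto_capacity(W, iters=200):
    """Alternate q(x|y) and r(x) updates; return capacity in bits."""
    n, m = len(W), len(W[0])
    r = [1.0 / n] * n                              # input distribution
    for _ in range(iters):
        # For fixed r: optimal q(x|y) is proportional to r(x) W[x][y]
        q = [[0.0] * n for _ in range(m)]
        for y in range(m):
            z = sum(r[x] * W[x][y] for x in range(n))
            for x in range(n):
                q[y][x] = r[x] * W[x][y] / z
        # For fixed q: optimal r(x) is proportional to exp(sum_y W[x][y] log q(x|y))
        logr = [sum(W[x][y] * math.log(q[y][x]) for y in range(m) if W[x][y] > 0)
                for x in range(n)]
        top = max(logr)
        unnorm = [math.exp(l - top) for l in logr]  # shift for numerical stability
        s = sum(unnorm)
        r = [u / s for u in unnorm]
    # C = sum_{x,y} r(x) W[x][y] log2( q(x|y) / r(x) )
    return sum(r[x] * W[x][y] * math.log2(q[y][x] / r[x])
               for x in range(n) for y in range(m) if W[x][y] > 0 and r[x] > 0)

eps = 0.1
W = [[1 - eps, eps], [eps, 1 - eps]]               # binary symmetric channel
C = blahut_arimoto_capacity(W)
Hb = -(eps * math.log2(eps) + (1 - eps) * math.log2(1 - eps))
print(abs(C - (1 - Hb)) < 1e-9)  # → True
```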