Q6

a) We have defined the information rate distortion function as

$$
R(D) \;=\; \min_{q(\hat{x}\mid x)\,:\; \sum_{x,\hat{x}} p(x)\,q(\hat{x}\mid x)\,d(x,\hat{x}) \,\le\, D} I(X;\hat{X}),
$$

where the minimization is over all conditional distributions $q(\hat{x}\mid x)$ for which the joint distribution $p(x)\,q(\hat{x}\mid x)$ satisfies the expected distortion constraint. This is a standard minimization problem of a convex function over the convex set of all $q(\hat{x}\mid x) \ge 0$ satisfying $\sum_{\hat{x}} q(\hat{x}\mid x) = 1$ for all $x$ and $\sum_{x,\hat{x}} p(x)\,q(\hat{x}\mid x)\,d(x,\hat{x}) \le D$.

We can use the method of Lagrange multipliers to find the solution. We set up the functional

$$
J(q) \;=\; \sum_{x}\sum_{\hat{x}} p(x)\,q(\hat{x}\mid x)\,\log\frac{q(\hat{x}\mid x)}{\sum_{x'} p(x')\,q(\hat{x}\mid x')}
\;+\; \lambda \sum_{x}\sum_{\hat{x}} p(x)\,q(\hat{x}\mid x)\,d(x,\hat{x})
\;+\; \sum_{x} \nu(x) \sum_{\hat{x}} q(\hat{x}\mid x),
$$

where the multiplier $\lambda$ corresponds to the distortion constraint and the multipliers $\nu(x)$ to the constraints that $q(\hat{x}\mid x)$ sums to 1 for each $x$.
Differentiating with respect to $q(\hat{x}\mid x)$, writing $q(\hat{x}) = \sum_{x} p(x)\,q(\hat{x}\mid x)$ for the induced output distribution, and setting the derivative to 0, we obtain

$$
\frac{\partial J}{\partial q(\hat{x}\mid x)} \;=\; p(x)\,\log\frac{q(\hat{x}\mid x)}{q(\hat{x})} \;+\; \lambda\,p(x)\,d(x,\hat{x}) \;+\; \nu(x) \;=\; 0.
$$
Setting $\log \mu(x) = \nu(x)/p(x)$, this gives $q(\hat{x}\mid x) = q(\hat{x})\,e^{-\lambda d(x,\hat{x})}/\mu(x)$. Since $\sum_{\hat{x}} q(\hat{x}\mid x) = 1$, we must have $\mu(x) = \sum_{\hat{x}} q(\hat{x})\,e^{-\lambda d(x,\hat{x})}$, or

$$
q(\hat{x}\mid x) \;=\; \frac{q(\hat{x})\,e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\,e^{-\lambda d(x,\hat{x}')}}
$$
for all $x$ and $\hat{x}$. We can combine these equations with the equation defining the distortion $D$ and calculate $\lambda$ and the unknowns $q(\hat{x})$. We can then use the expression above to find the optimum conditional distribution.
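As a concrete numerical check (my own illustration, not part of the original text), the sketch below takes a candidate output distribution $q(\hat{x})$ and a multiplier $\lambda$, forms $q(\hat{x}\mid x) \propto q(\hat{x})\,e^{-\lambda d(x,\hat{x})}$ as in the stationarity condition, and reports the resulting rate and expected distortion for a binary source with Hamming distortion:

```python
import numpy as np

def conditional_from_output(p, q_out, d, lam):
    """Form q(xhat|x) proportional to q(xhat) * exp(-lam * d(x, xhat)),
    normalized over xhat, as in the stationarity condition above."""
    w = q_out[None, :] * np.exp(-lam * d)        # shape (|X|, |Xhat|)
    return w / w.sum(axis=1, keepdims=True)

def rate_and_distortion(p, q_cond, d):
    """Mutual information I(X; Xhat) in bits and expected distortion
    under the joint distribution p(x) q(xhat|x)."""
    joint = p[:, None] * q_cond
    q_out = joint.sum(axis=0)                    # induced output marginal
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(joint > 0, joint / (p[:, None] * q_out[None, :]), 1.0)
    rate = (joint * np.log2(ratio)).sum()
    dist = (joint * d).sum()
    return rate, dist

# Illustrative example: uniform binary source, Hamming distortion.
p = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
q_cond = conditional_from_output(p, np.array([0.5, 0.5]), d, lam=2.0)
rate, dist = rate_and_distortion(p, q_cond, d)
print(f"rate = {rate:.4f} bits, distortion = {dist:.4f}")
```

For this symmetric case the numbers can be checked by hand: the distortion comes out to $e^{-\lambda}/(1+e^{-\lambda})$ and the rate to $1 - H(D)$, matching the known rate distortion function of a binary source with Hamming distortion.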

The above analysis is valid if $q(\hat{x}\mid x)$ is unconstrained (i.e., strictly positive). The inequality condition $q(\hat{x}\mid x) \ge 0$ is covered by the Kuhn–Tucker conditions, which reduce to

$$
\frac{\partial J}{\partial q(\hat{x}\mid x)}
\begin{cases}
= 0 & \text{if } q(\hat{x}\mid x) > 0, \\
\ge 0 & \text{if } q(\hat{x}\mid x) = 0.
\end{cases}
$$
Substituting the value of the derivative, we obtain the conditions for the minimum as

$$
\sum_{x} \frac{p(x)\,e^{-\lambda d(x,\hat{x})}}{\sum_{\hat{x}'} q(\hat{x}')\,e^{-\lambda d(x,\hat{x}')}}
\begin{cases}
= 1 & \text{if } q(\hat{x}) > 0, \\
\le 1 & \text{if } q(\hat{x}) = 0.
\end{cases}
$$
This characterization will enable us to check whether a given $q(\hat{x})$ is a solution to the minimization problem. However, it is not easy to solve for the optimum output distribution from these equations. In the next section we provide an iterative algorithm for computing the rate distortion function. This algorithm is a special case of a general algorithm for finding the minimum relative entropy distance between two convex sets of probability densities.
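Although the iterative algorithm belongs to the next section, the two relations derived here already suggest its shape: alternate between computing $q(\hat{x}\mid x)$ from $q(\hat{x})$ via the exponential formula, and recomputing $q(\hat{x})$ as the induced marginal. The sketch below is my own Blahut–Arimoto-style illustration under that reading (the example source and $\lambda$ are arbitrary choices, not from the text), together with a numerical check of the Kuhn–Tucker quantity:

```python
import numpy as np

def fixed_point_iteration(p, d, lam, n_iter=500):
    """Alternate the two fixed-point equations: q(xhat|x) from q(xhat)
    via the exponential formula, then q(xhat) as the induced marginal."""
    q_out = np.full(d.shape[1], 1.0 / d.shape[1])   # start from uniform output
    for _ in range(n_iter):
        w = q_out[None, :] * np.exp(-lam * d)
        q_cond = w / w.sum(axis=1, keepdims=True)   # stationarity condition
        q_out = p @ q_cond                          # induced output marginal
    return q_cond, q_out

def kuhn_tucker_values(p, q_out, d, lam):
    """sum_x p(x) e^{-lam d(x,xhat)} / sum_{xhat'} q(xhat') e^{-lam d(x,xhat')};
    at the minimum this equals 1 where q(xhat) > 0 and is <= 1 elsewhere."""
    e = np.exp(-lam * d)
    denom = e @ q_out                               # one value per source letter x
    return (p / denom) @ e                          # one value per output letter xhat

# Binary source with Hamming distortion as a sanity check.
p = np.array([0.6, 0.4])
d = np.array([[0.0, 1.0], [1.0, 0.0]])
q_cond, q_out = fixed_point_iteration(p, d, lam=2.0)
print("q(xhat) =", q_out)
print("Kuhn-Tucker values =", kuhn_tucker_values(p, q_out, d, lam=2.0))
```

At a fixed point of the iteration the Kuhn–Tucker values come out to 1 on the support of $q(\hat{x})$, which is exactly the optimality check derived above.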
