Scribe Note 2( 22-Jan )

Page history last edited by Harry 13 years, 7 months ago

1. Fano's inequality

Intuition:  If Y is a function of X, say Y = g(X), then H(Y|X) = 0. Conversely, if we can only estimate X from Y with some probability of error P_e, we expect H(X|Y) to be small when P_e is small, and H(X|Y) -> 0 as P_e -> 0. Fano's inequality makes this precise.

Proof:  We model the process as a Markov chain

                X ------> Y ------> X̂

where X̂ = g(Y) is the estimate of X, and P_e = Pr(X̂ ≠ X). We define the error indicator E as follows:

               E  =  1,    if  X̂ ≠ X

                   =  0,    if  X̂ = X

 

We expand H(E, X | X̂) in two ways,

              H(E, X | X̂) = H(X | X̂) + H(E | X, X̂)

              H(E, X | X̂) = H(E | X̂) + H(X | E, X̂)

For the second equation, since conditioning reduces entropy,

               H(E | X̂) <= H(E) = H(P_e)

and, since X = X̂ when E = 0 (so that conditional entropy is 0) while X ranges over at most |X| - 1 values when E = 1,

              H(X | E, X̂) = Pr(E=0) H(X | X̂, E=0) + Pr(E=1) H(X | X̂, E=1)

                                     <= (1 - P_e) * 0 + P_e log(|X| - 1)

                                     = P_e log(|X| - 1)                                                   (1)

which means

            H(E, X | X̂) <= H(P_e) + P_e log(|X| - 1)

 

For the first equation, since E is a function of X and X̂, then

          H(E | X, X̂) = 0

now,   H(E, X | X̂) = H(X | X̂)                                                       (2)

 

From (1), (2), we obtain

                          H(X | X̂) <= H(P_e) + P_e log(|X| - 1)     (3)

Next, we will prove

                     H(X | X̂) >= H(X | Y)

From the data-processing inequality, we have

                    I(X; X̂) <= I(X; Y)

since

               H(X | Y) = H(X) - I(X; Y)

               H(X | X̂) = H(X) - I(X; X̂)

we obtain

               H(X | X̂) >= H(X | Y)                                        (4)

 

From (3), (4),

               H(X | Y) <= H(X | X̂) <= H(P_e) + P_e log(|X| - 1)

viz. (using H(P_e) <= 1 and log(|X| - 1) <= log|X|)

               H(X | Y) <= 1 + P_e log|X|
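As a numerical sanity check (not part of the original note), the sketch below evaluates both sides of (3) for a small hypothetical example: X uniform on a ternary alphabet, a symmetric channel with total error probability p, and the estimator X̂ = Y. The channel and parameter values are illustrative assumptions; for this symmetric case Fano's bound in fact holds with equality.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical example: X uniform on {0,1,2}; the channel keeps the symbol
# w.p. 1-p and moves to each of the other two symbols w.p. p/2.
p = 0.1
px = np.full(3, 1/3)                       # P(X)
W = np.array([[1-p, p/2, p/2],
              [p/2, 1-p, p/2],
              [p/2, p/2, 1-p]])            # P(Y|X)
pxy = px[:, None] * W                      # joint P(X, Y)
py = pxy.sum(axis=0)

# H(X|Y) = sum_y P(y) H(X | Y=y)
HXgY = sum(py[y] * entropy(pxy[:, y] / py[y]) for y in range(3))

# Error probability of the estimator X_hat = Y
Pe = 1 - np.trace(pxy)

# Fano: H(X|Y) <= H(Pe) + Pe * log2(|X| - 1)
bound = entropy([Pe, 1 - Pe]) + Pe * np.log2(3 - 1)
print(f"H(X|Y) = {HXgY:.4f}, Fano bound = {bound:.4f}")
assert HXgY <= bound + 1e-9
```

For this uniform symmetric channel the two sides coincide, which shows the bound is tight: given an error, X is equally likely to be any of the |X| - 1 wrong symbols.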

 


2. Converse for source coding theorem.

Let X_1, X_2, ..., X_n be i.i.d. ~ p(x). Then X^n ----> f(X^n) ------> X̂^n form a Markov chain, where f(X^n) ∈ {1, 2, ..., 2^{nR}} is the index produced by the encoder and X̂^n is the decoder's reconstruction. Let P_e = Pr(X̂^n ≠ X^n).

 

nH(X) = H(X^n)                      (i.i.d.)

                = H(X^n | X̂^n) + I(X^n; X̂^n)        ( H(x) = H(x|y) + I(x;y) )

                <= H(X^n | X̂^n) + I(X^n; f(X^n))    (data-processing inequality in the above Markov chain)

                <= H(X^n | X̂^n) + H(f(X^n))           ( I(x;y) <= H(y) )

                <= H(X^n | X̂^n) + nR        ( H(x) <= log|X|, and f(X^n) takes at most 2^{nR} values )

                <= 1 + P_e n log|X| + nR     (Fano's inequality)

which means

 H(X) <= R + 1/n + P_e log|X|

so if P_e -> 0 as n -> ∞, then R >= H(X).
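Rearranging the final inequality gives a lower bound on the error probability that any code with rate R below the entropy must suffer: P_e >= (H(X) - R - 1/n) / log|X|. A minimal sketch, with illustrative numbers of my own choosing (a fair-coin source compressed at rate 0.8 bits/symbol):

```python
import math

def pe_lower_bound(HX, R, n, alphabet_size):
    """Lower bound on block error probability implied by the converse:
    n*H(X) <= 1 + P_e * n * log|X| + n*R  =>  P_e >= (H(X) - R - 1/n) / log|X|."""
    return max(0.0, (HX - R - 1.0 / n) / math.log2(alphabet_size))

# Hypothetical numbers: fair-coin source (H(X) = 1 bit), rate R = 0.8,
# block length n = 1000.
lb = pe_lower_bound(HX=1.0, R=0.8, n=1000, alphabet_size=2)
print(f"P_e >= {lb:.4f}")   # prints "P_e >= 0.1990"
```

So at any rate below H(X) the error probability is bounded away from 0 for large n, which is exactly the converse statement.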

 

 

 

 

 

Comments (8)

Mak said

at 8:12 pm on Feb 9, 2009

Is this note finished?

It is written in Scribe Notes 2 Feb that, "Last time, we proved the converse of the source coding theorem by ingenious use of several information-theoretic inequalities." But I can't seem to find the related info here?

Cho Yiu Ng said

at 8:19 pm on Feb 10, 2009

As I recall, there were certain discussions about this. So, Cheng Fan, would you mind writing down what we have discussed?

Harry said

at 10:28 pm on Feb 14, 2009

Sorry, the professor did indeed discuss the converse of the source coding theorem in class. I will update the note.

Cho Yiu Ng said

at 8:26 pm on Feb 22, 2009

Thanks Cheng Fan! Are there any comments?

Cho Yiu Ng said

at 11:32 am on Feb 24, 2009

As a number of you made a mistake in the problem set 3, I ask a question here (which was raised by Yuchen in the lecture as well). Is it correct to apply Fano's Inequality in the converse of Source Coding Theorem above? Please compare it carefully with the Fano's Inequality. On the other hand, if such application is incorrect, can we change the proof a little bit so that we can apply the Fano's Inequality?

Silas said

at 4:39 pm on Feb 24, 2009

I recall that we can follow the steps of the proof in the Fano's inequality to obtain the last inequality. The Fano's inequality utilizes the last inequality mentioned in the scribe note.

Silas said

at 4:41 pm on Feb 24, 2009

I see. We can make g(X) = X and apply the Fano's inequality.

Cho Yiu Ng said

at 4:20 pm on Feb 25, 2009

What is g(X)?
