

Scribe Notes 1

Saved by Lingxiao XIA
on January 10, 2009 at 1:39:58 pm







There are many ways to talk about information, but fundamentally it concerns what is new and cannot be predicted, i.e., what we don't know about the world. Claude Shannon, the father of information theory, described information as the resolution of uncertainty. It is a matter of statistics and probability. For example, what will happen when you toss a coin many times?


In this course, a lot of attention will be paid to biased-coin problems. For instance, consider a black-and-white picture of a dark room: the probability of a pixel being dark, denoted by 1, is larger than the probability of it being light, denoted by 0. Or consider a digitized signal on a noisy channel: if the channel is rather clear, the probability that the next symbol is noise-free, denoted by 0, is larger than the probability that it is noise, denoted by 1.
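As a small illustration (not from the lecture itself), a biased coin can be simulated directly; the bias p = 0.8 below is a hypothetical value chosen for the sketch:

```python
import random

def toss_biased_coin(p_dark, n, seed=0):
    """Simulate n tosses of a biased coin: 1 ("dark") with probability p_dark."""
    rng = random.Random(seed)
    return [1 if rng.random() < p_dark else 0 for _ in range(n)]

tosses = toss_biased_coin(p_dark=0.8, n=100_000)
print(sum(tosses) / len(tosses))  # empirical frequency of 1s, close to 0.8
```

The empirical frequency of 1s converges to the bias as the number of tosses grows, which is the statistical regularity that source coding exploits.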

Data compression:

  • lossless source coding

  • rate-distortion theory, i.e. lossy compression such as images (within some allowed error, compress as much as one can)


Problem Set 1

Background knowledge:

Independence; c.d.f. = cumulative distribution function of X:

Pr(X <= x) = F(x)

Two random variables X and Y are independent if and only if

Pr(X <= x, Y <= y) = Pr(X <= x) Pr(Y <= y)
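This product rule can be checked empirically. The sketch below (an illustration added to the notes, with arbitrarily chosen evaluation point (0.3, 0.7)) draws pairs of independent uniform random variables and compares the joint c.d.f. with the product of the marginals:

```python
import random

def empirical_cdf_check(n=200_000, x=0.3, y=0.7, seed=1):
    """Draw n pairs (X, Y) of independent uniforms on [0, 1) and compare
    the joint c.d.f. Pr(X <= x, Y <= y) with Pr(X <= x) * Pr(Y <= y)."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    ys = [rng.random() for _ in range(n)]
    joint = sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n
    product_of_marginals = (sum(1 for a in xs if a <= x) / n) * \
                           (sum(1 for b in ys if b <= y) / n)
    return joint, product_of_marginals

joint, prod = empirical_cdf_check()
print(joint, prod)  # both close to 0.3 * 0.7 = 0.21
```

For dependent variables the two quantities would differ at some point (x, y); for independent ones they agree everywhere.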

When there are three random variables, each pair of them can be independent while the three together are dependent: pairwise independence does not imply mutual independence.



i.i.d.: independent and identically distributed.


How many different sequences are there when tossing a coin ten times? 2^10 = 1024.
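The count can be confirmed by enumerating the sequences directly (a quick check, added for illustration):

```python
from itertools import product

# Each toss has 2 outcomes (heads or tails), so ten tosses give 2^10 sequences.
sequences = list(product("HT", repeat=10))
print(len(sequences))  # 1024
```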



