*UNFINISHED*
There are many ways to talk about information, but at its core it concerns what is new and cannot be predicted, i.e. what we do not yet know about the world. Claude Shannon, the father of information theory, said that information is the resolution of uncertainty. This makes it a matter of statistical properties and probability: for example, what happens when you toss a coin many times?
In this course, much attention will be paid to biased-coin problems. For instance, consider a black-and-white picture of a dark room: the probability of a pixel being dark, denoted by 1, is larger than the probability of it being light, denoted by 0. Or consider a digitized signal over a fairly clean channel: the probability that the next symbol is noise-free, denoted by 0, is larger than the probability that it is noisy, denoted by 1.
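A rough sense of why bias matters: the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p) gives the average information per symbol of a biased coin, peaking at 1 bit for a fair coin and shrinking as the source gets more predictable. A minimal Python sketch (the function name binary_entropy and the probability 0.9 are my own choices for illustration):

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p); the 0*log(0) terms are taken as 0
        if p in (0, 1):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(binary_entropy(0.5))  # 1.0 bit per toss: a fair coin is maximally unpredictable
    print(binary_entropy(0.9))  # ~0.47 bits: a heavily biased source carries less information

The less information per symbol, the more a long sequence from the source can be compressed, which is why the dark-room image above is a good candidate for compression.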
Data compression:
- lossless source coding (see the sketch after this list)
- rate-distortion theory, i.e. lossy compression: e.g. compress images as much as possible while staying within some allowed error
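As a quick illustration of lossless coding (not the formal theory), a general-purpose compressor such as Python's zlib shrinks a heavily biased bit sequence far more than a fair one, and decompression recovers every bit exactly. In the sketch below, the pixel probability 0.95 is an assumed value, and each pixel is stored in a whole byte purely for simplicity:

    import random
    import zlib

    random.seed(0)
    # Dark-room "image": each pixel is dark (1) with probability 0.95
    biased = bytes(1 if random.random() < 0.95 else 0 for _ in range(10000))
    # Fair-coin source for comparison
    fair = bytes(random.getrandbits(1) for _ in range(10000))

    for name, data in (("biased", biased), ("fair", fair)):
        packed = zlib.compress(data, 9)
        assert zlib.decompress(packed) == data  # lossless: exact reconstruction
        print(name, len(data), "->", len(packed), "bytes")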
Problem Set 1
Background knowledge:
Independence; c.d.f. (cumulative distribution function) of a random variable X:
F(x) = Pr(X <= x)
Two random variables X and Y are independent if and only if
Pr(X <= x, Y <= y) = Pr(X <= x) Pr(Y <= y) for all x and y.
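For a quick sanity check of this definition, one can estimate the joint c.d.f. of two independent dice by simulation and compare it with the product of the marginals at some point (x, y); the variable names and the test point below are my own:

    import random

    random.seed(1)
    N = 100_000
    rolls = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(N)]

    x0, y0 = 3, 4  # test the factorization at one point
    joint = sum(1 for x, y in rolls if x <= x0 and y <= y0) / N
    fx = sum(1 for x, _ in rolls if x <= x0) / N
    fy = sum(1 for _, y in rolls if y <= y0) / N
    print(joint, fx * fy)  # both should be close to (3/6) * (4/6) = 1/3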
With three random variables, it can happen that every pair is independent while the three together are not mutually independent.
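The classic example: let X and Y be independent fair bits and Z = X XOR Y. Any two of X, Y, Z are independent, yet Z is completely determined by the other two. A sketch that checks this by enumerating the four equally likely outcomes:

    from itertools import product

    # The four equally likely outcomes of (X, Y), each extended with Z = X XOR Y
    outcomes = [(x, y, x ^ y) for x, y in product((0, 1), repeat=2)]

    # Any pair, e.g. (X, Z): every combination occurs once out of four,
    # so Pr(X = x, Z = z) = 1/4 = Pr(X = x) * Pr(Z = z); the pair is independent.
    for x, z in product((0, 1), repeat=2):
        p = sum(1 for o in outcomes if (o[0], o[2]) == (x, z)) / 4
        print((x, z), p)

    # But the triple is dependent: Pr(X = 1, Y = 1, Z = 1) = 0, not 1/2 * 1/2 * 1/2 = 1/8
    print(sum(1 for o in outcomes if o == (1, 1, 1)) / 4)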
i.i.d.: independent and identically distributed.
How many different sequences are there when tossing a coin ten times? 2^10 = 1024.
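Each toss doubles the number of possible outcome strings, so ten tosses give 2 * 2 * ... * 2 = 2^10 distinct sequences; easy to confirm by brute-force enumeration:

    from itertools import product

    seqs = list(product("HT", repeat=10))  # every head/tail string of length 10
    print(len(seqs))  # 1024 == 2 ** 10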