Introduction to Information / "Worst-case compression" (Scribe: XIA Lingxiao)
Claude Elwood Shannon, the father of Information Theory, once said, "Information is the resolution of uncertainty". Things that cannot be predicted, i.e., what we don't know about the world, usually draw most of our attention. In this class we shall not discuss the semantic meaning of information; rather, we formulate an operational definition. That is, we model information mathematically as randomness, rather than pondering the philosophical implications of this randomness. More precisely, we examine the statistical and probabilistic properties of "information".

Our basic example, one we shall return to many times, is: what happens when a (biased) coin is tossed many times? For instance, in a black-and-white picture, a dark pixel may be denoted by a 1 and a light pixel by a 0. Statistically, the probability of a 1 should be significantly higher than that of a 0; perhaps this observation can be used to compress the image (i.e., to transfer fewer bits than the number of pixels in the original image). If we are willing to tolerate some distortion in the image, we may be able to compress it even more. These examples correspond to two formulations of the data compression problem, both of which we shall examine in this first course in information theory.
1. Lossless source coding (Exact reconstruction of the source's data is required.)
2. Rate-distortion theory (Given a budget of maximal reconstruction distortion, the goal is to minimize the number of bits in the compressed data.)
Another problem of interest is that of transmitting information over a noisy channel. For example, a sequence of bits is transmitted wirelessly; if the receiver decodes a bit to the same value that the transmitter sent, the channel is said to have added a 0 to that bit, and otherwise a 1. Depending on how noisy the ether is, careful design of coding schemes might allow an entire message to be transmitted with low error, even though individual bits might have a significant probability of decoding error. The set of problems corresponding to this problem formulation is called
3. Channel coding (Transmission of information over a noisy channel)
These are but three examples of a vast array of possible problems relating to information compression, storage, and transmission that one could conceivably be interested in. In this course we examine the theory and applications surrounding these three problems, and leave other problem formulations for more advanced classes.
Some knowledge of basic probability theory is assumed in this class; this is summarized here.
Problem Set 1
A certain fair coin is tossed n times. Each time it is tossed, the result is independent of the previous tosses. This is an example of a sequence of i.i.d. (independent and identically distributed) binary random variables.
Q1: (Worst-case "compression")
(a) If the coin described above is tossed n times, how many distinct length-n sequences of coin tosses are possible?
(b) Given N possible outcomes of a random variable, how many bits (elements of {0,1}) are required to assign a unique fixed-length bit-sequence to each possible outcome?
(Answer: ⌈log₂ N⌉, the ceiling of the binary logarithm of N.)
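The count in (b) can be checked with a short sketch (the helper name `bits_needed` is ours, not from the lecture):

```python
import math

def bits_needed(n_outcomes):
    # Minimum number of bits that gives each outcome a distinct
    # fixed-length binary codeword: the ceiling of log2(n_outcomes).
    return math.ceil(math.log2(n_outcomes))

print(bits_needed(2 ** 5))  # 5: the 2**5 length-5 toss sequences need 5 bits
print(bits_needed(1000))    # 10, since 2**9 = 512 < 1000 <= 1024 = 2**10
```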
(c) Describe a method, based on a binary tree, to assign a bit-sequence to each coin-toss sequence, and a corresponding method to reconstruct the coin-toss sequence from the bit-sequence.
Algorithm: encode each toss with one bit, following the two branches of the binary tree:

    Coin tossing    Bit sequence
    Heads           0
    Tails           1
Properties: easy to construct and recover; one-to-one.
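A minimal sketch of the scheme in (c), representing the depth-one binary tree implicitly as the map Heads → 0, Tails → 1 (the names `encode` and `decode` are ours):

```python
def encode(tosses):
    # Walk the binary tree once per toss: Heads -> '0', Tails -> '1'.
    return ''.join('0' if t == 'H' else '1' for t in tosses)

def decode(bits):
    # Invert the map, recovering the toss sequence exactly (one-to-one).
    return ['H' if b == '0' else 'T' for b in bits]

seq = ['H', 'T', 'T', 'H']
assert decode(encode(seq)) == seq   # exact reconstruction
print(encode(seq))                  # 0110
```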
(d) If the encoding alphabet is ternary (e.g., {0,1,2}) instead of binary, how does your answer to (b) change? How about your answer to (c), if ternary trees are allowed? What if you were tossing a six-sided die instead of a coin? How do your answers change? What is the time complexity of your assignment schemes?
Time complexity: O(n); more details in the next class.
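Part (d) generalizes (b) to an alphabet of size b: ⌈log_b N⌉ symbols suffice. A sketch, with our own helper names (`symbols_needed`, `to_base`) and no attention to floating-point corner cases:

```python
import math

def symbols_needed(n_outcomes, alphabet_size):
    # Generalization of (b): ceil(log_b N) symbols from an alphabet of
    # size b give each of N outcomes a distinct fixed-length codeword.
    # (A sketch: floating-point log may need care for exact powers of b.)
    return math.ceil(math.log(n_outcomes, alphabet_size))

def to_base(index, base, length):
    # Write an outcome's index as a fixed-length base-`base` codeword.
    digits = []
    for _ in range(length):
        digits.append(index % base)
        index //= base
    return digits[::-1]

# Four rolls of a six-sided die: 6**4 = 1296 outcomes.
k = symbols_needed(6 ** 4, 3)   # ternary symbols needed: ceil(log3 1296) = 7
print(k)                        # 7
print(to_base(1295, 3, k))      # [1, 2, 0, 2, 2, 2, 2]
```

Both `encode`-style assignment and `to_base` touch each of the n source symbols a constant number of times, which is where the O(n) time complexity comes from.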
"I don't get the names of things, I just know how often they happen. " [Lingxiao: I don't know where I heard this, but seems like this quote belongs somewhere in this note]
Homework: check out Stirling's Approximation.
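For the homework, Stirling's approximation, n! ≈ √(2πn)·(n/e)^n, can be previewed numerically (a small sketch; the helper name `stirling` is ours):

```python
import math

def stirling(n):
    # Stirling's approximation: n! is roughly sqrt(2*pi*n) * (n/e)**n.
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio approaches 1 as n grows (this form slightly underestimates n!).
for n in (5, 10, 20):
    print(n, math.factorial(n), round(stirling(n) / math.factorial(n), 4))
```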