Scribe Notes 1

 

Introduction to Information/"Worst-case compression" (Scribe: XIA Lingxiao)

 

Claude Elwood Shannon, the father of information theory, once said[1], "Information is the resolution of uncertainty". Things that cannot be predicted, i.e., what we don't know about the world, usually draw most of our attention. In this class we shall not discuss the semantic meaning of information; rather, we formulate an operational definition: we model information mathematically as randomness, rather than pondering the philosophical implications of that randomness. More precisely, we examine the statistical and probabilistic properties of "information". Our basic example, one we shall return to many times, is: what happens when a (biased) coin is tossed many times? For instance, in a black-and-white picture, a dark pixel may be denoted by a 1 and a light pixel by a 0. If the picture is mostly dark, the probability of a 1 is significantly higher than that of a 0 -- perhaps this observation can be used to compress the image, i.e., to transfer fewer bits than the number of pixels in the original image (a toy sketch of this idea follows the list below). If we are willing to tolerate some distortion in the image, we may be able to compress it even more. These examples correspond to two formulations of the data compression problem, both of which we shall examine in this first course in information theory.

1. Lossless source coding (Exact reconstruction of the source's data is required.)

2. Rate-distortion theory (Given a budget on the maximum reconstruction distortion, the goal is to minimize the number of bits in the compressed data.)
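
As a toy numerical sketch of the image example above (my own illustration, not a scheme from the lecture): if light pixels are rare, one naive lossless code simply lists the positions of the minority pixels, at $\lceil \log_2 n \rceil$ bits per position, which beats sending all n raw bits. In Python, with an assumed image of n = 10000 pixels and P(1) = 0.95:

import math, random

def position_list_bits(pixels):
    # Naive lossless code: name the minority symbol (1 bit), send the
    # minority count (ceil(log2(n+1)) bits), then each minority position
    # (ceil(log2 n) bits apiece).
    n = len(pixels)
    k = min(pixels.count(0), pixels.count(1))
    return 1 + math.ceil(math.log2(n + 1)) + k * math.ceil(math.log2(n))

n = 10000
pixels = [1 if random.random() < 0.95 else 0 for _ in range(n)]  # mostly-dark image (assumed bias)
print("raw encoding:", n, "bits")                                    # 10000 bits
print("position-list encoding:", position_list_bits(pixels), "bits")  # roughly 7000 bits

This only wins when the source is sufficiently biased; quantifying exactly how far one can compress is the subject of lossless source coding.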

 

Another problem of interest is that of transmitting information over a noisy channel. For example, suppose a sequence of bits is transmitted wirelessly: if the receiver decodes a bit to the same value that the transmitter sent, the noise is said to have added a 0 to that bit, and otherwise a 1. Depending on how noisy the channel is, careful design of coding schemes might allow an entire message to be transmitted with low error, even though individual bits might have a significant probability of decoding error (a toy sketch follows the item below). The set of problems corresponding to this formulation is called

3. Channel coding (Transmission of information over a noisy channel)
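
To make the channel-coding idea concrete, here is a minimal Python sketch (my own illustration; the crossover probability p = 0.1 is assumed) of a 3-fold repetition code over a binary symmetric channel: each bit is sent three times and decoded by majority vote, which drops the per-bit error probability from p to $3p^2(1-p) + p^3$ (about 0.028 for p = 0.1), at the cost of three channel uses per message bit.

import random

def bsc(bits, p):
    # Binary symmetric channel: flips each bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def encode(bits):
    # 3-fold repetition code: send each bit three times.
    return [b for b in bits for _ in range(3)]

def decode(received):
    # Majority vote over each block of three received bits.
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

p = 0.1  # assumed crossover probability
message = [random.randint(0, 1) for _ in range(10000)]
decoded = decode(bsc(encode(message), p))
errors = sum(m != d for m, d in zip(message, decoded))
print("raw flip probability:", p, "post-decoding error rate:", errors / len(message))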

 

These are but three examples of the vast array of problems relating to information compression, storage, and transmission that one could conceivably be interested in. In this course we examine the theory and applications surrounding these three problems, and leave other problem formulations for more advanced classes.

 

 

Some knowledge of basic probability theory is assumed in this class -- this is summarized here.

 

Problem Set 1

 

A certain fair coin is tossed n times. Each toss is independent of all previous tosses. The resulting sequence is an example of a sequence of i.i.d. (independent and identically distributed) binary random variables.

Q1: (Worst-case "compression")

(a) If the coin described above is tossed n times, how many distinct length-n sequences of coin tosses are possible?

        $2^n$

(b) Given N possible outcomes of a random variable, how many bits (elements from {0,1}) are required to assign a unique fixed-length bit-sequence to each possible outcome?

         $\lceil \log_2 N \rceil$ (the ceiling of the binary logarithm of N)

(c) Describe a method, based on a binary tree, to assign a length-n bit-sequence to a length-n sequence of coin tosses, and a corresponding method to reconstruct the coin-toss sequence from the bit-sequence.

        Algorithm: label the two branches leaving each node of a binary tree with the two toss outcomes (heads and tails) and with the bits 0 and 1, respectively. Walking down the tree, each toss selects one branch, so a sequence of tosses traces out a bit-sequence; the bit-sequence traces the same path back to recover the tosses (see the sketch after this block).

        Coin toss    Bit
        Heads        0
        Tails        1

        Property: easy to construct/recover, one-to-one
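
A minimal Python sketch of this assignment scheme (my own illustration, not code from the lecture); each toss selects one branch, so encoding and decoding each touch every symbol exactly once:

BRANCH = {"H": "0", "T": "1"}  # heads -> branch labeled 0, tails -> branch labeled 1
LEAF = {"0": "H", "1": "T"}    # inverse map for reconstruction

def encode(tosses):
    # Walk down the tree: map a toss sequence like "HTTH" to the bit-sequence "0110".
    return "".join(BRANCH[t] for t in tosses)

def decode(bits):
    # Retrace the same path to recover the toss sequence.
    return "".join(LEAF[b] for b in bits)

tosses = "HTTHHT"
bits = encode(tosses)
assert decode(bits) == tosses  # the assignment is one-to-one
print(tosses, "->", bits)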

(d) If the encoding alphabet is ternary (e.g., {0,1,2}) instead of binary, how does your answer to (b) change? How about your answer to (c), if ternary trees are allowed? What if you were tossing a six-sided die instead of a coin? How do your answers change? What is the time complexity of your assignment schemes?

        Time complexity: O(n); more details in the next class.
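
A hedged sketch of the generalization in (d): with an encoding alphabet of size b, any of N outcomes can be indexed with $\lceil \log_b N \rceil$ symbols (so $\lceil \log_3 N \rceil$ in the ternary case), and a six-sided die rolled n times has $6^n$ outcomes. The Python below (my own illustration) computes these lengths with exact integer arithmetic:

def symbols_needed(num_outcomes, alphabet_size):
    # Smallest L with alphabet_size ** L >= num_outcomes, i.e. ceil(log_b N),
    # using integers to avoid floating-point rounding at exact powers.
    length, capacity = 0, 1
    while capacity < num_outcomes:
        capacity *= alphabet_size
        length += 1
    return length

n = 20  # number of tosses/rolls (chosen for illustration)
for source, outcomes in [("coin", 2 ** n), ("six-sided die", 6 ** n)]:
    for b in (2, 3):
        print(source + ":", symbols_needed(outcomes, b), "symbols over an alphabet of size", b)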

 

 

"I don't get the names of things, I just know how often they happen. " [Lingxiao: I don't know where I heard this, but seems like this quote belongs somewhere in this note]

 

 

Homework: check out Stirling's Approximation.
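
For reference (a standard fact, not stated in the original notes), Stirling's approximation is

$n! \approx \sqrt{2\pi n}\,(n/e)^n$, or equivalently $\ln n! = n \ln n - n + O(\ln n)$,

the usual tool for estimating binomial coefficients such as $\binom{n}{k}$, the number of length-n coin-toss sequences with exactly k heads.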

Footnotes

  1. http://www.brainyquote.com/quotes/authors/c/claude_shannon.html
