Basic Probability Theory


 

Intuitively[1], a random variable (r.v.) $X$ is a function mapping its sample space $\Omega$ to the set $\mathbb{R}$ of real numbers.

 

Exercise: Read about sample spaces.

(a) A fair six-sided die is tossed. What is a "natural" sample space? What is a "natural" definition of the corresponding random variable?

(b) Two fair six-sided dice are tossed, and the sample space is taken to be the set of possible values of the sum of the two faces. What is the "natural" definition of the corresponding random variable?
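
A minimal Python sketch of both parts (not necessarily the intended answers; the names omega_a, X, and pmf_sum are purely illustrative):

    from fractions import Fraction
    from itertools import product

    # (a) One fair six-sided die: a natural sample space is {1, ..., 6}, and a
    # natural random variable maps each outcome to the number shown (the identity map).
    omega_a = range(1, 7)
    X = {outcome: outcome for outcome in omega_a}

    # (b) Two fair dice, with the sample space taken to be the possible sums
    # {2, ..., 12}; the natural random variable again maps each sum to itself.
    # Its p.m.f. follows from counting ordered pairs of faces giving each sum.
    counts = {}
    for a, b in product(range(1, 7), repeat=2):
        counts[a + b] = counts.get(a + b, 0) + 1
    pmf_sum = {s: Fraction(c, 36) for s, c in counts.items()}

    print(pmf_sum[7])  # 1/6, the most likely sum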

 

 

Definition: The cumulative distribution function (c.d.f.) $F_X(x)$ of a real-valued random variable $X$ represents the probability that $X$ takes on a value less than or equal to $x$, i.e., $F_X(x) = \Pr(X \le x)$. The c.d.f. can be defined for both discrete and continuous real-valued random variables.

 

Definition: The probability mass function (p.m.f.) $p_X(x)$ of a discrete random variable $X$ represents the probability that $X$ takes on a value equal to $x$, i.e., $p_X(x) = \Pr(X = x)$.
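
As a concrete (purely illustrative) instance of the last two definitions, the sketch below builds the p.m.f. of a fair six-sided die and derives its c.d.f. from it:

    from fractions import Fraction

    # p.m.f. of a fair die: p_X(k) = P(X = k) = 1/6 for k in {1, ..., 6}.
    pmf = {k: Fraction(1, 6) for k in range(1, 7)}

    def cdf(x):
        """F_X(x) = P(X <= x): sum the p.m.f. over all values not exceeding x."""
        return sum(p for k, p in pmf.items() if k <= x)

    print(cdf(3))    # 1/2
    print(cdf(2.5))  # 1/3 -- the c.d.f. is defined for every real x
    print(cdf(10))   # 1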

 

Definition: The probability density function (p.d.f.) $f_X(x)$ of a discrete or continuous (or mixed!) random variable is an analogue of the p.m.f. -- the integral of the p.d.f. over any interval represents the probability that the random variable $X$ takes on a value in that interval, i.e., $\Pr(a \le X \le b) = \int_a^b f_X(x)\,dx$.
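
The following rough numerical check (a sketch only; the Exponential(1) distribution is chosen arbitrarily) illustrates the relation between the p.d.f. and the c.d.f.: integrating the p.d.f. over an interval recovers the probability of landing in that interval.

    import math

    def pdf(x):
        return math.exp(-x)        # f_X(x) for x >= 0 (Exponential with rate 1)

    def cdf(x):
        return 1.0 - math.exp(-x)  # F_X(x) in closed form

    def integrate(f, a, b, n=100_000):
        """Crude midpoint Riemann-sum approximation of the integral of f over [a, b]."""
        h = (b - a) / n
        return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

    a, b = 0.5, 2.0
    print(integrate(pdf, a, b))  # approx. 0.4712, i.e. P(0.5 <= X <= 2)
    print(cdf(b) - cdf(a))       # the same value, computed via the c.d.f.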

 

 

Exercise: Read about joint distributions and conditional distributions of sets of random variables. Construct two non-trivial examples of each.
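
One possible (hypothetical) pair of examples for this exercise, sketched in Python: the joint p.m.f. of X (the first of two fair dice) and Y (the sum of both), and the conditional p.m.f. of Y given X = 3.

    from fractions import Fraction
    from itertools import product

    # Joint p.m.f.: P(X = x, Y = y) over all ordered pairs of faces.
    joint = {}
    for a, b in product(range(1, 7), repeat=2):
        joint[(a, a + b)] = joint.get((a, a + b), 0) + Fraction(1, 36)

    # Conditional p.m.f. of Y given X = 3, obtained by renormalizing the slice X = 3.
    p_x3 = sum(p for (x, y), p in joint.items() if x == 3)         # P(X = 3) = 1/6
    cond = {y: p / p_x3 for (x, y), p in joint.items() if x == 3}  # P(Y = y | X = 3)

    print(p_x3)   # 1/6
    print(cond)   # uniform over {4, ..., 9}, each value having probability 1/6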

 

 

Exercise: Construct c.d.f.s and p.d.f.s for any two examples of continuous random variables, and c.d.f.s, p.m.f.s and p.d.f.s for any two examples of discrete random variables.

 

 

Definition: A discrete-time stochastic process (the only kind of stochastic process we'll consider in this class) is just a sequence of random variables.
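
A minimal sketch of one such process: the sequence of partial sums of fair ±1 coin flips (a simple random walk), where each time index n carries a random variable S_n.

    import random

    def random_walk(n_steps, seed=0):
        """Return one realization (sample path) of S_1, ..., S_n for the simple random walk."""
        rng = random.Random(seed)
        path, s = [], 0
        for _ in range(n_steps):
            s += rng.choice([-1, +1])  # one fair +/-1 step
            path.append(s)
        return path

    print(random_walk(10))  # one sample path of the first 10 random variables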

 

 

Exercise: Read about (strictly) stationary stochastic processes (intuitively, in a stationary process the joint distribution of any finite subset of the random variables is invariant under shifts in time). Construct two examples of stationary stochastic processes, and two examples of non-stationary stochastic processes.
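
For instance (a sketch only, not one of the required constructions per se), an i.i.d. ±1 sequence is stationary, while the random walk from the previous sketch is not: its marginal variance already grows linearly with time, so the distributions cannot be shift-invariant.

    import random
    import statistics

    rng = random.Random(1)
    n_paths = 5000

    def walk_samples(t):
        """Draw n_paths independent samples of S_t, the random walk's position at time t."""
        return [sum(rng.choice([-1, +1]) for _ in range(t)) for _ in range(n_paths)]

    # For the i.i.d. +/-1 process, every X_t has variance 1 regardless of t (stationary).
    # For the walk, Var(S_t) = t, so the empirical variance at t = 50 is about 50x larger.
    print(statistics.pvariance(walk_samples(1)))   # approx. 1
    print(statistics.pvariance(walk_samples(50)))  # approx. 50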

 

 

Exercise: Read about ergodic stochastic processes (intuitively, in an ergodic process a long sequence of n observations taken from a single realization behaves, on average, in the same manner as n independent copies of that random variable, so time averages match ensemble averages). Construct two examples of ergodic processes, and two examples of non-ergodic processes.
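
As an illustration of the intuition (a sketch, not a proof): for i.i.d. fair coin flips the time average along one long realization approaches the ensemble mean 1/2, whereas for the stationary but non-ergodic process "flip one fair coin once and repeat its value forever" the time average is stuck at 0 or 1.

    import random

    rng = random.Random(2)
    n = 100_000

    # Ergodic example: i.i.d. fair coin flips; the time average approaches E[X] = 1/2.
    iid_time_avg = sum(rng.randint(0, 1) for _ in range(n)) / n

    # Non-ergodic (but stationary) example: one coin flip B repeated at every instant;
    # the time average equals B itself and never approaches the ensemble mean 1/2.
    B = rng.randint(0, 1)
    frozen_time_avg = sum(B for _ in range(n)) / n

    print(iid_time_avg)     # approx. 0.5
    print(frozen_time_avg)  # exactly 0.0 or 1.0, depending on B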

 

 

Exercise: Read about independent and conditionally independent random variables (intuitively, two r.v.s X and Y are independent if the probability of any event involving X is unaffected by any event involving Y, and vice versa; they are conditionally independent given a third r.v. Z if the same holds once the value of Z is fixed and known). For three (or more) random variables, what are the appropriate definitions? (Hint -- the number of conditions that have to be satisfied grows exponentially with the number of random variables.) Give an example of a sequence of n random variables such that any n-1 of them are independent, but all n together are not. Can you generalize this to an example where, for any k < n, any set of k r.v.s is independent, but any set of k+1 r.v.s is not?
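
A sketch of the XOR (parity-bit) construction discussed in the comments below, assuming n = 4 purely for concreteness: X_1, ..., X_{n-1} are i.i.d. fair bits and X_n is their XOR, so any n-1 of them are mutually independent while all n together are not.

    from fractions import Fraction
    from itertools import product

    n = 4
    outcomes = []
    for bits in product([0, 1], repeat=n - 1):
        outcomes.append(bits + (sum(bits) % 2,))  # append the XOR (parity) bit
    prob = Fraction(1, len(outcomes))             # each of the 2^(n-1) outcomes is equally likely

    # All n together are NOT independent: for one representative event,
    # P(X_1 = ... = X_n = 1) differs from the product (1/2)^n of the marginals.
    p_all_ones = sum(prob for o in outcomes if all(b == 1 for b in o))
    print(p_all_ones, Fraction(1, 2) ** n)        # 1/8 vs 1/16 for n = 4

    # Any n-1 of them (say X_2, ..., X_n) look independent: for the same kind of
    # event the joint probability matches the product of the marginals, (1/2)^(n-1).
    p_tail_ones = sum(prob for o in outcomes if all(b == 1 for b in o[1:]))
    print(p_tail_ones, Fraction(1, 2) ** (n - 1))  # both 1/8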

 

 

Exercise: Read about Markov chains (intuitively, a Markov chain is a sequence of random variables such that, at any instant, the past and the future are conditionally independent given the present). Give two examples of Markov chains.
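
A minimal sketch of one example (a hypothetical two-state weather chain, 0 = sunny and 1 = rainy, with made-up transition probabilities), simulated step by step; each next state depends only on the current state, not on the earlier history.

    import random

    # Transition probabilities: row P[s] gives the distribution of the next state given state s.
    P = {0: [0.9, 0.1],
         1: [0.5, 0.5]}

    def simulate(n_steps, state=0, seed=3):
        """Simulate n_steps transitions of the chain, starting from the given state."""
        rng = random.Random(seed)
        path = [state]
        for _ in range(n_steps):
            state = rng.choices([0, 1], weights=P[state])[0]
            path.append(state)
        return path

    print(simulate(20))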

Footnotes

  1. Definition from Wikipedia: http://en.wikipedia.org/wiki/Random_variable

Comments (4)

FENG Yiyong said

at 10:34 pm on Sep 17, 2011

Hi, for this problem, "Give examples of a sequence of n random variables such that any n-1 are independent, but all n together are not. Can you generalize this to an example where, for any k < n, any set of k r.v.s are independent, but any set of k+1 r.v.s are not?" -- should we give general examples for arbitrary n > 3? When n = 3 it seems easy to find examples, but for arbitrary n > 3 it seems not so easy... does anyone have an idea?

Chin Ho Lee said

at 10:40 pm on Sep 17, 2011

The example mentioned in class (i.e., the 3rd random variable being the XOR of the first two) can be easily generalized to arbitrary n.

FENG Yiyong said

at 11:26 pm on Sep 17, 2011

wow, I get your idea, it's true...thanks very much~

chengfan said

at 3:52 pm on Sep 19, 2011

search ``erasure code''
