- Source-channel separation
We have a source $V$ that generates symbols from an alphabet $\mathcal{V}$.
Assumption: the alphabet is finite and the process satisfies the AEP. Examples: a sequence of i.i.d. random variables; the sequence of states of a stationary irreducible Markov chain.
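To make the AEP assumption concrete, here is a minimal Python sketch (illustration only; the Bernoulli source parameter is assumed, not from the notes) checking that $-\frac{1}{n}\log_2 p(V^n)$ concentrates around the entropy as $n$ grows:

```python
import numpy as np

# Illustration of the AEP for an i.i.d. Bernoulli(p) source (p is an
# assumed example parameter): -(1/n) log2 p(V^n) should concentrate
# around the entropy H(p) as n grows.
p = 0.2
H = -p * np.log2(p) - (1 - p) * np.log2(1 - p)  # binary entropy H(p)

rng = np.random.default_rng(0)
for n in (100, 1000, 10000):
    v = rng.random(n) < p                 # one source sequence V^n
    k = v.sum()                           # number of ones in V^n
    log_prob = k * np.log2(p) + (n - k) * np.log2(1 - p)  # log2 p(V^n)
    print(f"n={n:6d}  -(1/n) log2 p(V^n) = {-log_prob / n:.4f}   (H = {H:.4f})")
```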
We want to send the sequence of symbols $V^n = V_1, V_2, \ldots, V_n$ over the channel so that the receiver can reconstruct the sequence. To do this, we map the sequence onto a codeword $X^n(V^n)$ and send the codeword over the channel. The receiver looks at the received sequence $Y^n$ and makes an estimate $\hat{V}^n$ of the sequence $V^n$ that was sent. The receiver makes an error if $\hat{V}^n \neq V^n$.
Define the probability of error $P_e^{(n)} = \Pr(\hat{V}^n \neq V^n)$.
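As a concrete (and entirely toy) instance of the chain $V^n \to X^n \to Y^n \to \hat{V}^n$, the following sketch estimates $P_e^{(n)}$ by simulation; the repetition code over a BSC is an assumed stand-in, not the typical-set code used in the proof below:

```python
import numpy as np

# Toy end-to-end chain V^n -> X^n -> Y^n -> V^n_hat over a BSC, with a
# rate-1/3 repetition code as an assumed stand-in encoder.
rng = np.random.default_rng(1)

def encode(v):                  # X^n(V^n): repeat each source bit 3 times
    return np.repeat(v, 3)

def channel(x, eps=0.01):       # BSC(eps): flip each bit with prob. eps
    return x ^ (rng.random(x.size) < eps)

def decode(y):                  # majority vote over each block of 3
    return y.reshape(-1, 3).sum(axis=1) >= 2

n, trials, errors = 100, 2000, 0
for _ in range(trials):
    v = rng.random(n) < 0.5     # i.i.d. uniform source bits (assumed)
    v_hat = decode(channel(encode(v.astype(int))))
    errors += not np.array_equal(v, v_hat)   # block error: V^n_hat != V^n

print(f"estimated P_e^(n) ~ {errors / trials:.3f}")
```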
Theorem (source-channel coding theorem): If $V_1, V_2, \ldots$ is a finite-alphabet stochastic process that satisfies the AEP, then there exists a source-channel code with $P_e^{(n)} \to 0$ if $H(\mathcal{V}) < C$.
Conversely, for any stationary stochastic process, if $H(\mathcal{V}) > C$, the probability of error is bounded away from zero, and it is not possible to send the process over the channel with arbitrarily low probability of error.
Proof:
Achievability:
Since we have assumed that the stochastic process satisfies the AEP, there exists a typical set $A_\epsilon^{(n)}$ of size at most $2^{n(H(\mathcal{V})+\epsilon)}$ which contains most of the probability. We will encode only the source sequences belonging to the typical set; all other sequences will result in an error. This contributes at most $\epsilon$ to the probability of error.
We index all the sequences belonging to $A_\epsilon^{(n)}$. Since there are at most $2^{n(H(\mathcal{V})+\epsilon)}$ such sequences, $n(H(\mathcal{V})+\epsilon)$ bits suffice to index them. We can transmit the desired index to the receiver with probability of error less than $\epsilon$ if the corresponding rate satisfies $H(\mathcal{V}) + \epsilon < C$.
To be precise,
$$P_e^{(n)} = \Pr(\hat{V}^n \neq V^n) \leq \Pr\big(V^n \notin A_\epsilon^{(n)}\big) + \Pr\big(\hat{V}^n \neq V^n \mid V^n \in A_\epsilon^{(n)}\big) \leq \epsilon + \epsilon = 2\epsilon$$
for $n$ sufficiently large.
Hence, we can reconstruct the sequence with low probability of error for $n$ sufficiently large if $H(\mathcal{V}) < C$.
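As a numerical sanity check on the condition $H(\mathcal{V}) < C$, here is a small sketch with an assumed Bernoulli source and BSC channel (the parameters are illustrative, not from the notes):

```python
import numpy as np

def h2(p):   # binary entropy in bits
    return 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

# Assumed example: Bernoulli(0.1) source over a BSC with crossover 0.1.
p_src, delta, eps, n = 0.1, 0.1, 0.02, 1000
H = h2(p_src)            # source entropy rate, ~0.469 bits/symbol
C = 1 - h2(delta)        # BSC capacity, ~0.531 bits/channel use

bits_needed = n * (H + eps)  # enough to index the <= 2^{n(H+eps)} typical sequences
print(f"H = {H:.3f}, C = {C:.3f}, H + eps < C: {H + eps < C}")
print(f"bits to index the typical set: {bits_needed:.0f} <= n*C = {n*C:.0f}")
```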
Converse: we will show that $P_e^{(n)} \to 0$ implies $H(\mathcal{V}) \leq C$ for any sequence of source-channel codes.
By Fano's inequality, $H(V^n \mid \hat{V}^n) \leq 1 + P_e^{(n)}\, n \log|\mathcal{V}|$. Hence,
$$\begin{aligned}
H(\mathcal{V}) &\leq \frac{H(V_1, \ldots, V_n)}{n} && \text{by definition of the entropy rate of a stationary process} \\
&= \frac{1}{n} H(V^n \mid \hat{V}^n) + \frac{1}{n} I(V^n; \hat{V}^n) \\
&\leq \frac{1}{n}\big(1 + P_e^{(n)}\, n \log|\mathcal{V}|\big) + \frac{1}{n} I(V^n; \hat{V}^n) && \text{by Fano's inequality} \\
&\leq \frac{1}{n}\big(1 + P_e^{(n)}\, n \log|\mathcal{V}|\big) + \frac{1}{n} I(X^n; Y^n) && \text{by the data processing inequality} \\
&\leq \frac{1}{n} + P_e^{(n)} \log|\mathcal{V}| + C && \text{by the memorylessness of the channel.}
\end{aligned}$$
Hence, letting $n \to \infty$, the first two terms tend to 0, and we obtain $H(\mathcal{V}) \leq C$.
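Rearranging the last line gives an explicit converse bound, $P_e^{(n)} \geq \big(H(\mathcal{V}) - C - \frac{1}{n}\big)/\log|\mathcal{V}|$, showing the error probability is bounded away from zero when $H(\mathcal{V}) > C$. A small sketch under assumed parameters:

```python
import numpy as np

def h2(p):   # binary entropy in bits
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

# Rearranging H(V) <= 1/n + P_e * log|V| + C gives the converse bound
# P_e >= (H(V) - C - 1/n) / log|V|.  Assumed example: a Bernoulli(0.5)
# source (H = 1 bit) over a BSC(0.2) channel (C ~ 0.278 bits), |V| = 2.
H, C, log_V = 1.0, 1 - h2(0.2), 1.0
for n in (10, 100, 1000):
    print(f"n={n:5d}  P_e >= {(H - C - 1/n) / log_V:.3f}")
```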
Note: The data compression theorem is a consequence of the AEP, which shows that there exists a small subset (of size about $2^{nH}$) of all possible source sequences that contains most of the probability, and that we can therefore represent the source with a small probability of error using $H$ bits per symbol. The data transmission theorem is based on the joint AEP; it uses the fact that for long block lengths, the output sequence of the channel is very likely to be jointly typical with the input codeword, while any other codeword is jointly typical with the output with probability about $2^{-nI}$. Hence we can use about $2^{nI}$ codewords and still have a negligible probability of error.
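To put rough numbers on this note, a sketch under the same kind of assumed parameters (Bernoulli source, BSC channel) comparing the counts of all sequences, typical sequences, and usable codewords:

```python
import numpy as np

def h2(p):   # binary entropy in bits
    return -p*np.log2(p) - (1-p)*np.log2(1-p)

# Rough orders of magnitude for the note above (assumed parameters):
# a Bernoulli(0.1) source and a BSC(0.1) channel at block length n.
n = 100
H = h2(0.1)          # ~0.469: typical set has about 2^{nH} sequences
C = 1 - h2(0.1)      # ~0.531: about 2^{nC} usable codewords (joint AEP)

print(f"all binary sequences : 2^{n}")
print(f"typical source seqs  : ~2^{n*H:.0f}")
print(f"usable codewords     : ~2^{n*C:.0f}")
```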