First of all, there exist two types of problem.
1. Error detection: u --> x --> y --> decide whether y = x, with high probability. We only need to know whether an error happened or not.
2. Error correction: u --> x --> y --> û, with û = u holding with high probability. We need not only to find the error but also to correct it.
Begin with the q-ary symmetric channel.
Here q is large compared with n; q is the cardinality of the alphabet, so each symbol of x lies in {0, 1, 2, ..., q-1}.
Information Theory => average case (small Pe).
Coding Theory => worst case (Pe = 0).
You transmit x and receive y. For each position i,
Pr(no error at the i-th symbol) = Pr(y_i = x_i) = 1 - p,
and each of the other q - 1 symbols is received with probability p/(q-1).
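As a quick illustration (not from the notes), here is a small Python sketch of the q-ary symmetric channel: each transmitted symbol is kept with probability 1 - p and otherwise replaced by one of the other q - 1 symbols chosen uniformly at random. The function name and parameters are illustrative.

```python
import random

def qsc_transmit(x, q, p):
    """Pass a q-ary word x (symbols in {0, ..., q-1}) through a
    q-ary symmetric channel with symbol error probability p."""
    y = []
    for xi in x:
        if random.random() < p:
            # Error: replace xi by one of the other q-1 symbols, uniformly.
            wrong = random.randrange(q - 1)
            y.append(wrong if wrong < xi else wrong + 1)
        else:
            # No error at the i-th symbol (probability 1 - p).
            y.append(xi)
    return y

# Example: a length-5 word over an alphabet of size q = 256.
x = [17, 200, 3, 99, 255]
y = qsc_transmit(x, q=256, p=0.1)
print(x, "->", y)
```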

If an error happens in the transmission:

Information Theory
    \  m - l information bits  / \  l parity bits  /
In the m bits of the code there are m - l bits of information and l bits of parity; each symbol is an n-bit vector.
We need l = o(m) (l << m) and the probability of an undetected error to be small.
Let l = 2 log n. Then:
Prove that with probability 1 - 1/n^2, an error will be detected.
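A minimal sketch of this detection scheme, assuming the l check bits are produced by a random binary parity matrix shared by sender and receiver (that construction and all names below are assumptions for illustration): an error pattern that touches the information bits passes all l checks with probability 2^{-l}, which is 1/n^2 when l = 2 log2 n.

```python
import math
import random

def make_parity_matrix(l, k, seed=0):
    """Random l x k binary parity matrix, shared by sender and receiver."""
    rng = random.Random(seed)
    return [[rng.randint(0, 1) for _ in range(k)] for _ in range(l)]

def parity_bits(A, u):
    """Compute the l check bits A*u (mod 2) for the info bits u."""
    return [sum(a * b for a, b in zip(row, u)) % 2 for row in A]

def encode(A, u):
    return u + parity_bits(A, u)          # m = k + l bits in total

def detect_error(A, received, k):
    """Return True if the received word fails the parity checks."""
    info, checks = received[:k], received[k:]
    return parity_bits(A, info) != checks

# Example with n = 16, so l = 2*log2(n) = 8 check bits.
n = 16
l = 2 * int(math.log2(n))
k = 32                                    # k = m - l information bits (illustrative)
A = make_parity_matrix(l, k)
u = [random.randint(0, 1) for _ in range(k)]
x = encode(A, u)
y = x[:]
y[5] ^= 1                                 # flip one bit in transmission
print("error detected:", detect_error(A, y, k))
```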
Information Theory
Ex2: If q is sufficiently large, prove that Pr(detect error) -> 1.
Coding Theory
How to decode the code
The distance of x and x' is Δ(x, x'). To correct errors we need the following steps.
1. Hamming distance: the number of locations in which x and x' differ, Δ(x, x') = |{i : x_i ≠ x'_i}|.
2. Given a code {X}, its min-distance is d = min Δ(x, x'), where x and x' range over distinct codewords.
3. To correct np errors, the minimum distance must be at least 2np + 1; to detect errors we just need d ≥ np + 1.
=> Singleton bound: with min distance d, if we delete any d - 1 coordinates, the codewords restricted to the remaining n - d + 1 coordinates are still unique, so there are at most q^{n-d+1} codewords, i.e.
nR ≤ n - d + 1.
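The distance steps and the Singleton bound can be checked mechanically; the following sketch (illustrative, with a tiny example code chosen here rather than taken from the notes) computes the Hamming distance, the minimum distance, and the bound k ≤ n - d + 1.

```python
from itertools import combinations

def hamming_distance(x, y):
    """Number of locations in which x and y differ."""
    return sum(a != b for a, b in zip(x, y))

def min_distance(code):
    """Minimum Hamming distance over all pairs of distinct codewords."""
    return min(hamming_distance(x, y) for x, y in combinations(code, 2))

# A tiny example code over {0,1}: the 3-fold repetition code of length 3.
code = ["000", "111"]
n = len(code[0])
k = 1                       # log2(|code|) information bits
d = min_distance(code)
print("min distance d =", d)
print("can correct", (d - 1) // 2, "errors, detect", d - 1, "errors")
print("Singleton bound k <= n - d + 1:", k <= n - d + 1)
```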
Coding theory
With rate 1 - p, a p fraction of errors can be detected.
With rate 1 - 2p, a p fraction of errors can be corrected.
Repetition code
For k = 1 bit encoded into n copies, the rate is R = k/n = 1/n.
Decoding error: only if more than n/2 of the bits are in error.
Cons: the rate of the code is too low.
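A minimal sketch of the repetition code (function names are illustrative): each information bit is repeated n times and decoded by majority vote, which only fails when more than n/2 of the copies are flipped.

```python
def rep_encode(bit, n):
    """Repeat one information bit n times; rate = 1/n."""
    return [bit] * n

def rep_decode(received):
    """Majority vote; wrong only if more than half of the bits are in error."""
    return 1 if sum(received) > len(received) / 2 else 0

# Example with n = 5: two of the five copies are corrupted, decoding still succeeds.
codeword = rep_encode(1, 5)          # [1, 1, 1, 1, 1]
received = [1, 0, 1, 0, 1]           # 2 errors < n/2
print(rep_decode(received))          # -> 1
```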
Hamming code
For example: n = 7, k = 4, d = 3.
[Figure: three overlapping circles; the data bits n1..n4 sit in the overlaps and each parity bit d1, d2, d3 sits in exactly one circle.]
In each circle, the number of '1's should be even.
The maximum number of errors corrected is 1.
Error example:
We receive the word [n1 n2 n3 n4 d1 d2 d3] = [1111011].
Only in circle 1 (the one containing d1) is the number of '1's odd => d1 is the bit in error.
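The circle picture can be turned into a short decoder. The sketch below assumes the standard assignment of the data bits n1..n4 to the circle overlaps (n4 in all three circles) and one parity bit per circle; that assignment and the function names are illustrative, but the [1111011] example from the notes comes out as described: only circle 1 fails, so d1 is flipped back.

```python
# Word layout: [n1, n2, n3, n4, d1, d2, d3] (indices 0..6).
# Assumed layout: n4 lies in all three circles, n1/n2/n3 in two circles each,
# and each parity bit d1, d2, d3 in exactly one circle.
CIRCLES = [
    {0, 1, 3, 4},   # circle 1: n1, n2, n4, d1
    {0, 2, 3, 5},   # circle 2: n1, n3, n4, d2
    {1, 2, 3, 6},   # circle 3: n2, n3, n4, d3
]

def failing_circles(word):
    """Indices of circles whose number of 1s is odd (parity check fails)."""
    return {c for c, circle in enumerate(CIRCLES)
            if sum(word[i] for i in circle) % 2 == 1}

def correct_single_error(word):
    """Correct at most one flipped bit by matching the failing-circle pattern."""
    bad = failing_circles(word)
    if not bad:
        return word                       # all parities even: no error detected
    for i in range(7):
        # The erroneous position is the unique one contained in exactly the bad circles.
        if {c for c, circle in enumerate(CIRCLES) if i in circle} == bad:
            fixed = word[:]
            fixed[i] ^= 1
            return fixed
    return word

# The example from the notes: received word 1111011, error in d1.
received = [1, 1, 1, 1, 0, 1, 1]
print("failing circles:", failing_circles(received))      # -> {0} (circle 1, contains d1)
print("corrected word: ", correct_single_error(received)) # -> [1, 1, 1, 1, 1, 1, 1]
```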