# Repetition Encoder Illustration

The following is an example of using redundancy to correct errors introduced by transmission over a binary symmetric channel, illustrated with a drawing of a monkey. It was inspired by David J.C. MacKay's *Information Theory, Inference, and Learning Algorithms*.

## Method:

Repetition codes are one simple method of improving communication over a noisy
channel. Briefly, if you hope to transmit the following binary string, **s**:

s= 001000110111

over an imperfect channel that flips bits with probability *f*, you can use a
repetition code to improve your chances of ultimately conveying the correct
message: you simply repeat each bit a fixed number of times. Using the repetition
code R_{3}, meaning 3 repeats per bit, the transmitted string **t** is:

t= 000 000 111 000 000 000 111 111 000 111 111 111
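The encoding step can be sketched in a few lines of Python (the function name here is my own, not necessarily the one used in the original code):

```python
def repetition_encode(s: str, n: int = 3) -> str:
    """Repeat each bit of the binary string s n times (the R_n code)."""
    return "".join(bit * n for bit in s)

t = repetition_encode("001000110111")
# t == "000000111000000000111111000111111111"
```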

The imperfect channel adds a noise vector **n**, resulting in the received string **r**.
You can then assume that the most frequent bit in each received triple is the one
truly represented in your original string, **s**. This is called majority logic
decoding.
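Majority logic decoding can be sketched as follows (again, a hypothetical helper name for illustration):

```python
def majority_decode(r: str, n: int = 3) -> str:
    """Decode each block of n received bits by majority vote."""
    blocks = [r[i:i + n] for i in range(0, len(r), n)]
    return "".join("1" if b.count("1") > n // 2 else "0" for b in blocks)

majority_decode("010000111")
# -> "001": each triple votes 0, 0, 1
```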

The table below shows the original signal (**s**), the redundantly encoded transmitted
signal (**t**), the noise added by the channel (**n**), the received signal
(**r** = **t** + **n**, with addition modulo 2), and the signal estimated by majority
logic decoding. Stars mark errors that were fixed by the encoding; "NOT" marks an
error that was not fixed.

```
s   =  0   0   1   0   0   0   1   1   0   1   1   1
t   = 000 000 111 000 000 000 111 111 000 111 111 111
n   = 010 000 000 110 000 000 000 001 000 000 000 010
r   = 010 000 111 110 000 000 111 110 000 111 111 101
est =  0   0   1   1   0   0   1   1   0   1   1   1
       *          NOT              *               *
```

## Results:

I implemented this algorithm in Python for illustration purposes, and the code
is available here. A binary string (representing the above black-and-white monkey
drawing) was transmitted across a binary symmetric channel with bit-flip
probability *f* = 0.1. I varied the number of repetitions from 1 to 19 and then
decoded the message as described above. The errors shrink as the number of
repetitions grows, but at the cost of the rate of information transfer: R_{n}
carries useful information at only rate 1/n.
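The experiment can be sketched end to end as below. This is a minimal reimplementation, not the original code: the helper names, random seed, and test-string length are my own choices.

```python
import random

def repetition_encode(s: str, n: int) -> str:
    # Repeat each bit n times (the R_n code).
    return "".join(bit * n for bit in s)

def bsc(t: str, f: float, rng: random.Random) -> str:
    # Binary symmetric channel: flip each bit independently with probability f.
    return "".join(bit if rng.random() >= f else str(1 - int(bit)) for bit in t)

def majority_decode(r: str, n: int) -> str:
    # Majority vote within each block of n received bits.
    return "".join(
        "1" if r[i:i + n].count("1") > n // 2 else "0"
        for i in range(0, len(r), n)
    )

rng = random.Random(0)  # arbitrary seed, for reproducibility only
s = "".join(rng.choice("01") for _ in range(10_000))
for n in (1, 3, 5):
    est = majority_decode(bsc(repetition_encode(s, n), 0.1, rng), n)
    errors = sum(a != b for a, b in zip(s, est))
    print(f"R_{n}: {100 * errors / len(s):.3f}% error")
```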

Below are the inferred, decoded images that had been encoded with various numbers
of repetitions. All were transmitted with bit-flip probability *f* = 0.1.
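The measured error rates track the theoretical per-bit error probability of majority decoding, which is the probability that more than half of a bit's n transmitted copies get flipped. A quick check (my own calculation, not part of the original code):

```python
from math import comb

def majority_error_prob(n: int, f: float) -> float:
    # P(decoding one bit fails) = P(more than n // 2 of its n copies flip).
    return sum(
        comb(n, k) * f**k * (1 - f) ** (n - k)
        for k in range(n // 2 + 1, n + 1)
    )

for n in (1, 3, 5, 7):
    print(f"R_{n}: {100 * majority_error_prob(n, 0.1):.3f}%")
```

For *f* = 0.1 this gives roughly 10% for R_{1}, 2.8% for R_{3}, and 0.86% for R_{5}, in line with the measured values below.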

### R_{1} (No repetitions)

Percent Error: 10.041%

### R_{3}

Percent Error: 2.738%

### R_{5}

Percent Error: 0.868%

### R_{7}

Percent Error: 0.302%

### R_{9}

Percent Error: 0.087%

### R_{11}

Percent Error: 0.0278%

### R_{13}

Percent Error: 0.007%

### R_{15}

Percent Error: 0.002%

### R_{17}

Percent Error: 0.002%

### R_{19}

Percent Error: 0.00%