
What is information theory?

Information theory is a branch of mathematics that deals with information and communication. This post presents some information theory concepts and their importance to telecommunications.

Information and entropy

What is information? In this theory, Shannon information is a measure of the degree of uncertainty in a symbol or message: the greater the uncertainty, the greater the Shannon information. The equation below gives the relation between information I and probability. Shannon information must not be confused with information in the everyday sense, which is the reduction of uncertainty.

I=-\log_2 P(x)

P(x) is the probability that message x is transmitted. If a transmitter can emit only one type of message, the receiver gets that message with 100% probability (P(x)=1). Therefore, the Shannon information is 0 bits.

I=-\log_2 1

I=0

If a coin is flipped only once, there is a 50% probability of getting heads or tails (P(x)=0.5). In this case, we have 1 bit of Shannon information.

I=-\log_2\frac{1}{2}

I=1
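These two examples can be checked with a short Python sketch (the function name shannon_information is just illustrative):

import math

def shannon_information(p):
    # Shannon information of an event with probability p, in bits
    return -math.log2(p)

# A message that is certain (P(x) = 1) carries no Shannon information.
print(shannon_information(1.0))  # 0.0

# A fair coin flip (P(x) = 0.5) carries exactly 1 bit.
print(shannon_information(0.5))  # 1.0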

Consider three buckets with 4 balls each. Bucket 1, which has only red balls, has less entropy than bucket 2, because in bucket 1 we know we will always draw a red ball. In turn, bucket 2 has less entropy than bucket 3, whose contents are the most evenly mixed. Therefore, the more knowledge we have about the system, the lower the entropy.

Entropy is the sum, over all symbols, of each symbol's information multiplied by its probability, and is calculated as follows.

H=\sum_{i=1}^{n} p(x_i)\cdot I(x_i)

The graph of entropy as a function of probability, for a binary source, is a curve that peaks at 1 bit when both outcomes are equally likely. Entropy is measured in bits per symbol or per message. This concept is important because it shows the theoretical minimum average number of bits needed to represent a symbol, and it is used in data compression to save space.
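Below is a minimal Python sketch of this formula, applied to the bucket example; the exact ball counts in buckets 2 and 3 are assumptions here, since they come from the figure:

import math

def entropy(probabilities):
    # H = sum of p(x_i) * I(x_i); symbols with p = 0 contribute nothing
    return sum(p * -math.log2(p) for p in probabilities if p > 0)

# Assumed compositions, 4 balls per bucket:
bucket1 = [4/4]        # only red balls: no uncertainty
bucket2 = [3/4, 1/4]   # assumed 3 red, 1 blue
bucket3 = [2/4, 2/4]   # assumed 2 red, 2 blue: maximum uncertainty
print(entropy(bucket1))  # 0.0 bits per symbol
print(entropy(bucket2))  # ~0.811
print(entropy(bucket3))  # 1.0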

Channel capacity

A message needs to be transmitted through a noisy environment and be reproduced without distortion. Every communication system has a channel with a maximum capacity, and every channel has noise.

Shannon's fundamental theorem says: if the transmission rate is lower than or equal to the channel capacity, an error-correcting code can be used to obtain an arbitrarily low error rate. However, if the transmission rate is greater than the capacity, the transmission will always have errors, no matter which error-correcting code is used. The following equation calculates the maximum capacity of a channel in bits per second.

C=B\cdot \log_2(1+SNR)

Where B is the bandwidth in Hz and SNR is the signal-to-noise ratio, expressed as a linear power ratio (not in dB).
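As a numerical illustration, here is the formula in Python; the 3000 Hz bandwidth and 30 dB SNR are assumed example values, close to those of an analog telephone line:

import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon-Hartley capacity in bits per second
    return bandwidth_hz * math.log2(1 + snr_linear)

snr_db = 30
snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear power ratio
print(channel_capacity(3000, snr_linear))  # ~29902 bits per second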
