
Code rate


Figure: Different code rates (Hamming code).

In telecommunication and information theory, the code rate (or information rate[1]) of a forward error correction code is the proportion of the data stream that is useful (non-redundant). That is, if the code rate is k/n, then for every k bits of useful information the coder generates a total of n bits of data, of which n − k are redundant.
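
To make the definition concrete, here is a minimal Python sketch (the function names are illustrative, not from the article) that computes the code rate and the number of redundant bits of a generic (n, k) block code, using the Hamming(7,4) code as an example, in the spirit of the figure caption above.

    from fractions import Fraction

    def code_rate(k, n):
        # Code rate of an (n, k) code: k useful bits out of n transmitted bits.
        return Fraction(k, n)

    def redundant_bits(k, n):
        # Redundant (parity) bits per codeword: n - k.
        return n - k

    # Example: the Hamming(7,4) code carries 4 data bits in each 7-bit codeword.
    print(code_rate(4, 7))       # 4/7
    print(redundant_bits(4, 7))  # 3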

If R is the gross bit rate or data signalling rate (inclusive of redundant error coding), the net bit rate (the useful bit rate exclusive of error-correction codes) is ≤ R · k/n.
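
As a numerical illustration (the figures below are assumed for the example, not taken from the article): a gross bit rate R of 1 Mbit/s combined with a rate-3/4 code gives a net bit rate of at most R · k/n = 750 kbit/s.

    # Net bit rate from gross bit rate and code rate (example values are assumed).
    R_gross = 1_000_000       # gross bit rate in bit/s, including redundant coding
    k, n = 3, 4               # a rate-3/4 code
    R_net = R_gross * k / n   # upper bound on the useful bit rate
    print(R_net)              # 750000.0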

For example: the code rate of a convolutional code will typically be 1/2, 2/3, 3/4, 5/6, 7/8, etc., corresponding to one redundant bit inserted after every single, second, third, etc., bit. The code rate of the octet-oriented Reed–Solomon block code denoted RS(204,188) is 188/204, meaning that 204 − 188 = 16 redundant octets (or bytes) are added to each block of 188 octets of useful information.
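
The Reed–Solomon figures quoted above can be reproduced directly; the short sketch below only restates that arithmetic.

    # RS(204,188): each block of 188 useful octets is expanded to 204 transmitted octets.
    k_octets, n_octets = 188, 204
    print(n_octets - k_octets)   # 16 redundant octets per block
    print(k_octets / n_octets)   # code rate 188/204 ≈ 0.9216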

A few error correction codes, such as rateless erasure codes, do not have a fixed code rate.

Note that the information rate is more often measured in bit/s, in which case the term is synonymous with the net bit rate, that is, the useful bit rate exclusive of error-correction codes.

References

  1. Huffman, W. Cary; Pless, Vera (2003). Fundamentals of Error-Correcting Codes. Cambridge University Press.