Random code[1] 

Description

Code whose construction is non-deterministic in some way, i.e., codes that utilize an element of randomness somewhere in their construction. Members of this class range from fully non-deterministic codes to codes whose multi-step construction is deterministic with the exception of a single step.

Typically, random codes are selected with uniform probability from some ensemble of codes. For example, a random binary code is a set of \(2^{K}\) codewords of length \(N\), each chosen uniformly and independently from the set of all \(2^N\) bit-strings; equivalently, each bit of each codeword is set to 0 or 1 with equal probability. For another example, a random binary linear code is generated by \(K\) randomly chosen generators of length \(N\), where each bit of each generator is set to 0 or 1 with equal probability. Equivalently, a random binary linear code is defined by a randomly generated \(K\times N\) generator matrix whose entries are chosen to be 0 or 1 with equal probability.
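
As a concrete illustration, here is a minimal sketch (in Python with NumPy; the function names and parameters are ours, not part of any standard library) that samples both ensembles: a fully random code as \(2^K\) uniformly random length-\(N\) bit-strings, and a random linear code via a uniformly random \(K\times N\) generator matrix over GF(2).

```python
import numpy as np
from itertools import product

def random_code(K, N, rng=None):
    """Sample a random binary code: 2^K codewords, each a uniform N-bit string."""
    rng = np.random.default_rng(rng)
    return rng.integers(0, 2, size=(2**K, N), dtype=np.uint8)

def random_linear_code(K, N, rng=None):
    """Sample a random binary linear code via a uniform K x N generator matrix over GF(2).
    Returns the generator matrix and all 2^K codewords (messages times G, mod 2)."""
    rng = np.random.default_rng(rng)
    G = rng.integers(0, 2, size=(K, N), dtype=np.uint8)
    messages = np.array(list(product([0, 1], repeat=K)), dtype=np.uint8)
    codewords = messages @ G % 2
    return G, codewords

# Example with small parameters so the codebooks are easy to inspect.
C_random = random_code(K=3, N=7, rng=0)
G, C_linear = random_linear_code(K=3, N=7, rng=0)
```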

In both of the above constructions, the ensemble size scales exponentially with \(N\). By a common convention, a construction is regarded as effectively explicit (as opposed to random) when the ensemble size is independent of \(N\), or even when it scales only polynomially with \(N\).

Rate

Typical random codes (TRC) or typical random linear codes (TLC) refer to codes in the respective ensemble that satisfy a certain minimum distance. The relative fraction of typical codes in the ensemble approaches one as \(N\) goes to infinity [1] (see also Ref. [2]). Asymptotically, given relative distance \(\delta = d/N\), the maximum rate for a TRC is \(R=\frac{1}{2}R_{GV}(\delta)\), where \(R_{GV}(\delta)=1-h(\delta)\) is the Gilbert–Varshamov (GV) bound and \(h\) is the binary entropy function. The maximum rate for a TLC is \(R=R_{GV}(\delta)\), meaning that TLCs achieve the asymptotic GV bound.
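
As a quick numerical check of these formulas, the binary entropy and the resulting TRC and TLC rates can be evaluated directly (a minimal sketch; the helper names are ours):

```python
import numpy as np

def binary_entropy(delta):
    """Binary entropy h(delta) in bits, with h(0) = h(1) = 0."""
    if delta in (0.0, 1.0):
        return 0.0
    return -delta * np.log2(delta) - (1 - delta) * np.log2(1 - delta)

def gv_rate(delta):
    """Asymptotic Gilbert-Varshamov bound R_GV(delta) = 1 - h(delta)."""
    return 1.0 - binary_entropy(delta)

delta = 0.11
print("TLC rate R_GV(delta):  ", gv_rate(delta))      # ~0.5
print("TRC rate R_GV(delta)/2:", gv_rate(delta) / 2)  # ~0.25
```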

Decoding

  • Ball-collision decoding [3].
  • Information set decoding (ISD) [4] and Finiasz and Sendrier (FS-ISD) decoding [5]; the basic idea of ISD is illustrated in the sketch below.
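
To convey the flavor of these decoders, the following is a minimal sketch of plain (Prange-style) information set decoding for syndrome decoding over GF(2); it illustrates only the basic idea, not the improved algorithms of Refs. [3–5], and all function names are ours.

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2) for square A via Gauss-Jordan elimination.
    Returns the solution vector, or None if A is singular."""
    M = np.concatenate([A % 2, (b % 2).reshape(-1, 1)], axis=1).astype(np.uint8)
    n = A.shape[0]
    for col in range(n):
        pivot_rows = np.nonzero(M[col:, col])[0]
        if pivot_rows.size == 0:
            return None
        pivot = col + pivot_rows[0]
        M[[col, pivot]] = M[[pivot, col]]      # move pivot row into place
        for row in range(n):
            if row != col and M[row, col]:
                M[row] ^= M[col]               # eliminate the column elsewhere
    return M[:, -1]

def prange_isd(H, s, t, max_iters=100_000, rng=None):
    """Plain information set decoding: find an error e of weight <= t with H e = s (mod 2).
    Repeatedly guesses that the error is confined to a random set of n-k coordinates."""
    rng = np.random.default_rng(rng)
    r, n = H.shape
    for _ in range(max_iters):
        cols = rng.choice(n, size=r, replace=False)  # candidate error support
        e_sub = gf2_solve(H[:, cols], s)
        if e_sub is not None and e_sub.sum() <= t:
            e = np.zeros(n, dtype=np.uint8)
            e[cols] = e_sub
            return e
    return None
```

Each iteration succeeds only if the true error happens to be supported entirely on the chosen \(n-k\) coordinates, so the expected number of iterations grows exponentially for errors of weight proportional to \(n\); the improved variants listed above reduce this exponent.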

Realizations

  • Distributed storage systems [6].
  • Classical and quantum cryptography based on the learning-with-errors problem, which is related to decoding a random linear code [7].
  • Random codes can be used to realize secure computation [8].

Notes

Shannon's pioneering work [1] analyzes the distance distribution of a random code at a given rate. Given \(N\) and rate \(R\), the minimum distance of a TRC is \(d=N\delta_{GV}(2R)\), where \(\delta_{GV}(R) = h^{-1}(1-R)\) for \(0\le R \le 1\) and \(\delta_{GV}(R)=0\) otherwise is the GV relative distance. For a TLC, the minimum distance is \(d=N\delta_{GV}(R)\), consistent with the GV bound.
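
These expressions can be evaluated numerically by inverting the binary entropy function, e.g., by bisection on \([0,1/2]\) (a minimal sketch with our own helper names; the entropy helper repeats the one from the Rate sketch above):

```python
import numpy as np

def binary_entropy(delta):
    """Binary entropy h(delta) in bits, with h(0) = h(1) = 0."""
    if delta in (0.0, 1.0):
        return 0.0
    return -delta * np.log2(delta) - (1 - delta) * np.log2(1 - delta)

def delta_gv(R):
    """Relative GV distance: delta_GV(R) = h^{-1}(1 - R) for 0 <= R <= 1, else 0."""
    if not 0 <= R <= 1:
        return 0.0
    target = 1.0 - R
    lo, hi = 0.0, 0.5                 # h is increasing on [0, 1/2]
    for _ in range(60):               # bisection to high precision
        mid = (lo + hi) / 2
        if binary_entropy(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

N, R = 1000, 0.5
print("TLC minimum distance ~", N * delta_gv(R))      # ~110
print("TRC minimum distance ~", N * delta_gv(2 * R))  # ~0 (vanishes for R >= 1/2)
```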

Parent

Child

Cousins

  • Fountain code
  • Justesen code — The required inner codes are obtained by random sampling from the Wozencraft ensemble, whose length scales logarithmically with \(n\).
  • Goldreich-Sudan code
  • Low-density parity-check (LDPC) code — LDPC codes are often constructed non-deterministically.
  • Generalized RS (GRS) code — Concatenations of GRS codes with random linear codes almost surely attain the GV bound [9].
  • Meir code
  • Random quantum code
  • Expander LP code — Expander lifted-product codes are quantum CSS codes that utilize short classical codes in their construction, which need to satisfy certain properties (Ref. [10], Lemma 10). Such codes are shown to exist, but they are not explicitly constructed; they can be obtained by repeated random sampling or by a search over all codes of the desired length. Nevertheless, since the length of the desired short codes does not scale with \(n\), this construction is effectively explicit.

References

[1]
C. E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal 27, 379 (1948) DOI
[2]
A. Barg and G. D. Forney, “Random codes: minimum distances and error exponents”, IEEE Transactions on Information Theory 48, 2568 (2002) DOI
[3]
D. J. Bernstein, T. Lange, and C. Peters, “Smaller Decoding Exponents: Ball-Collision Decoding”, Lecture Notes in Computer Science 743 (2011) DOI
[4]
A. Becker, A. Joux, A. May, and A. Meurer, “Decoding Random Binary Linear Codes in 2^(n/20): How 1 + 1 = 0 Improves Information Set Decoding”, Lecture Notes in Computer Science 520 (2012) DOI
[5]
M. Finiasz and N. Sendrier, “Security Bounds for the Design of Code-Based Cryptosystems”, Lecture Notes in Computer Science 88 (2009) DOI
[6]
Y. Lin, B. Liang, and B. Li, “Priority Random Linear Codes in Distributed Storage Systems”, IEEE Transactions on Parallel and Distributed Systems 20, 1653 (2009) DOI
[7]
O. Regev, “On lattices, learning with errors, random linear codes, and cryptography”, Journal of the ACM 56, 1 (2009) DOI
[8]
H. Chen, R. Cramer, S. Goldwasser, R. de Haan, and V. Vaikuntanathan, “Secure Computation from Random Error Correcting Codes”, Advances in Cryptology - EUROCRYPT 2007 291 (2007) DOI
[9]
C. Thommesen, “The existence of binary linear concatenated codes with Reed–Solomon outer codes which asymptotically meet the Gilbert–Varshamov bound”, IEEE Transactions on Information Theory 29, 850 (1983) DOI
[10]
P. Panteleev and G. Kalachev, “Asymptotically Good Quantum and Locally Testable Classical LDPC Codes”, (2022) arXiv:2111.03654
Zoo Code ID: random

Cite as:
“Random code”, The Error Correction Zoo (V. V. Albert & P. Faist, eds.), 2022. https://errorcorrectionzoo.org/c/random
BibTeX:
@incollection{eczoo_random, title={Random code}, booktitle={The Error Correction Zoo}, year={2022}, editor={Albert, Victor V. and Faist, Philippe}, url={https://errorcorrectionzoo.org/c/random} }
Permanent link:
https://errorcorrectionzoo.org/c/random