
Berlekamp-Massey algorithm

From Encyclopedia of Mathematics

A procedure for solving an equation of the form

$$\sigma(z) S(z) \equiv \omega(z) \pmod{z^{2t}}$$

for the polynomials $\sigma(z)$, $\deg \sigma(z) \le t$, and $\omega(z)$, $\deg \omega(z) \le t-1$, given the polynomial $S(z)$, $\deg S(z) \le 2t-1$. Originally intended for the decoding of certain cyclic error-correcting codes (cf. also Error-correcting code), it has since found wide application in unrelated areas.

The decoding of cyclic error-correcting codes defined over a finite field, such as Bose–Chaudhuri–Hocquenghem and Reed–Solomon codes, can be achieved by solving the above equation, referred to as the key equation [a1] in this context, where the minimum distance of the code is at least $2t+1$. For cyclic codes of length $n$ over the finite field $\mathbb{F}_q$ with $q$ elements, $(n,q)=1$, codewords are viewed as $n$-tuples over $\mathbb{F}_q$, and are also naturally identified with polynomials over $\mathbb{F}_q$ of degree at most $n-1$. Let $\alpha$ be a primitive $n$-th root of unity in an extension field of $\mathbb{F}_q$. The coordinate positions of the code are identified with the distinct powers of $\alpha$. Cyclic codes with designed distance $2t+1$ consist of all codeword polynomials having a common set of $2t$ consecutive powers of $\alpha$ as zeros, the zero set of the code. The polynomial $S(z)$ is the syndrome polynomial, whose $i$-th coefficient $S_i$, $0 \le i \le 2t-1$, is the evaluation of the received word at the $i$-th element of the zero set of the code. The received word is the sum, over $\mathbb{F}_q$, of the transmitted codeword and an error word whose non-zero coordinate positions correspond to error locations. It is assumed there are at most $t$ coordinate positions in error. The decoding process retrieves the transmitted codeword. The unique solution to the key equation gives $\sigma(z)$, the error locator polynomial, and $\omega(z)$, the error evaluator polynomial. The zeros of the error locator polynomial yield the coordinate positions at which errors occur, and the error value $e_i$ at the position $i$ corresponding to the zero $\alpha^{-i}$ of $\sigma(z)$ is, for a narrow-sense code with zero set $\alpha, \dots, \alpha^{2t}$, given by Forney's formula $e_i = -\omega(\alpha^{-i})/\sigma'(\alpha^{-i})$, where $\sigma'(z)$ indicates the formal derivative of $\sigma(z)$.
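As a concrete illustration of the syndrome computation just described, here is a minimal Python sketch, assuming a narrow-sense code over a prime field GF(p) with zero set $\alpha, \alpha^2, \dots, \alpha^{2t}$; the function name and parameters are illustrative and not part of the original article.

def syndromes(received, alpha, t, p):
    # Return [S_0, ..., S_{2t-1}]: the received word r(z), given as a list of
    # coefficients (constant term first), evaluated at the 2t elements of the
    # assumed zero set alpha, alpha^2, ..., alpha^{2t}.
    S = []
    for i in range(2 * t):
        x = pow(alpha, i + 1, p)  # i-th element of the zero set
        S.append(sum(c * pow(x, j, p) for j, c in enumerate(received)) % p)
    return S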

E.R. Berlekamp [a1] gave a computationally efficient iterative algorithm to solve the key equation for the two polynomials. J.L. Massey [a4] showed that this algorithm gives a general solution to the problem of synthesizing the shortest linear feedback shift register that generates the syndrome sequence. At each stage the algorithm incrementally adjusts the length of the shift register and the feedback multiplier taps to generate the syndrome sequence to that point, and it terminates when the whole sequence is generated. In this formulation, the set of shift register feedback tap values yields the coefficients of the error locator polynomial. The iterative algorithm of Berlekamp, together with the feedback shift register synthesis interpretation of Massey, is known as the Berlekamp–Massey algorithm.
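The shift-register formulation lends itself to a compact implementation. The following is a minimal sketch of the Berlekamp–Massey iteration over a prime field GF(p); the interface and variable names are illustrative assumptions. Applied to the syndrome sequence $S_0, \dots, S_{2t-1}$, the returned connection polynomial supplies the coefficients of the error locator polynomial.

def berlekamp_massey(s, p):
    # Synthesize the shortest LFSR generating the sequence s over GF(p).
    # Returns the connection polynomial C(z) = 1 + c_1 z + ... + c_L z^L as a
    # coefficient list, together with the register length L, so that
    # s_n = -(c_1 s_{n-1} + ... + c_L s_{n-L}) for all n >= L.
    C, B = [1], [1]    # current connection polynomial and the one before the last length change
    L, m, b = 0, 1, 1  # register length, steps since last length change, last non-zero discrepancy
    for n in range(len(s)):
        # discrepancy between s_n and the symbol predicted by the current register
        d = s[n]
        for i in range(1, L + 1):
            d = (d + C[i] * s[n - i]) % p
        if d == 0:
            m += 1
            continue
        T = C[:]
        coef = d * pow(b, -1, p) % p       # d / b in GF(p)
        if len(C) < len(B) + m:
            C += [0] * (len(B) + m - len(C))
        for i, Bi in enumerate(B):         # C(z) <- C(z) - (d/b) z^m B(z)
            C[i + m] = (C[i + m] - coef * Bi) % p
        if 2 * L <= n:
            L, B, b, m = n + 1 - L, T, d, 1  # the register length must grow
        else:
            m += 1
    return C, L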

The solution to the key equation, and hence the Berlekamp–Massey algorithm, has connections to several other algorithms, most notably the extended Euclidean algorithm [a5], [a3] and continued fractions [a6] (cf. also Continued fraction). The extended Euclidean algorithm iteratively determines the greatest common divisor of two elements in a Euclidean domain, in this case the polynomial ring $\mathbb{F}_q[z]$. Specifically, for polynomials $a(z)$ and $b(z)$ with $\deg b(z) \le \deg a(z)$, define $r_{-1}(z) = a(z)$ and $r_0(z) = b(z)$, and define a sequence of quotient and remainder polynomials $q_i(z)$, $r_i(z)$ by

$$r_{i-2}(z) = q_i(z) r_{i-1}(z) + r_i(z), \qquad \deg r_i(z) < \deg r_{i-1}(z), \qquad i = 1, 2, \dots$$

The degrees of the remainder polynomials are strictly decreasing, and the last non-zero remainder polynomial is the greatest common divisor of $a(z)$ and $b(z)$. The quotient polynomials can be used to define a sequence of subsidiary polynomials $u_i(z)$ and $v_i(z)$, with strictly increasing degrees, such that at each stage of the algorithm

$$u_i(z) a(z) + v_i(z) b(z) = r_i(z),$$

so that $v_i(z) b(z) \equiv r_i(z) \pmod{a(z)}$ for all $i$. Applied to the case $a(z) = z^{2t}$ and $b(z) = S(z)$, since the $r_i(z)$ have strictly decreasing degrees and the $v_i(z)$ strictly increasing degrees, it can be shown that the algorithm reaches a unique point where $\deg r_i(z) \le t-1$ and $\deg v_i(z) \le t$, yielding a solution to the key equation with $\sigma(z) = v_i(z)$ and $\omega(z) = r_i(z)$.
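A direct implementation of this Euclidean route to the key equation, essentially the approach of [a5], might look as follows over a prime field GF(p); polynomials are coefficient lists with the constant term first, and the function names and interface are illustrative assumptions.

def deg(f):
    # Degree of a coefficient list (constant term first); the zero polynomial has degree -1.
    return len(f) - 1 if any(f) else -1

def poly_divmod(a, b, p):
    # Quotient and remainder of a(z) divided by a non-zero b(z) over GF(p);
    # b is assumed trimmed, i.e. its last entry is its true leading coefficient.
    r = a[:]
    q = [0] * max(len(a) - len(b) + 1, 1)
    inv = pow(b[-1], -1, p)
    for i in range(len(a) - len(b), -1, -1):
        c = r[i + len(b) - 1] * inv % p
        q[i] = c
        for j, bj in enumerate(b):
            r[i + j] = (r[i + j] - c * bj) % p
    while len(r) > 1 and r[-1] == 0:
        r.pop()
    return q, r

def solve_key_equation(S, t, p):
    # Run the extended Euclidean algorithm on a(z) = z^{2t} and b(z) = S(z),
    # stopping at the first remainder of degree at most t - 1. Returns
    # (sigma, omega) with sigma(z) S(z) = omega(z) mod z^{2t}, determined up to
    # a common scalar factor.
    r_prev = [0] * (2 * t) + [1]   # a(z) = z^{2t}
    r_cur = S[:]
    while len(r_cur) > 1 and r_cur[-1] == 0:
        r_cur.pop()                # trim trailing zero coefficients
    v_prev, v_cur = [0], [1]       # subsidiary polynomials v_{-1}(z), v_0(z)
    while deg(r_cur) >= t:
        q, r_next = poly_divmod(r_prev, r_cur, p)
        # v_next(z) = v_prev(z) - q(z) v_cur(z)
        v_next = v_prev + [0] * max(0, len(q) + len(v_cur) - 1 - len(v_prev))
        for i, qi in enumerate(q):
            for j, vj in enumerate(v_cur):
                v_next[i + j] = (v_next[i + j] - qi * vj) % p
        r_prev, r_cur, v_prev, v_cur = r_cur, r_next, v_cur, v_next
    return v_cur, r_cur            # sigma(z), omega(z)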

Recasting the key equation as

$$S(z) \equiv \frac{\omega(z)}{\sigma(z)} \pmod{z^{2t}},$$

the problem is to determine $\sigma(z)$ of smallest degree, with $\omega(z)$ of degree at most $\deg \sigma(z) - 1$, such that the first $2t$ terms of the expansion of the fraction $\omega(z)/\sigma(z)$ about the origin yield the sequence of syndromes. Such a partial fraction expansion leads naturally to an equivalent continued fraction interpretation of the Berlekamp–Massey algorithm [a6].
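This interpretation is easy to check numerically: expanding $\omega(z)/\sigma(z)$ as a power series about the origin must reproduce the syndrome sequence. Below is a minimal sketch over a prime field GF(p), with illustrative names, based on the recursion $c_n = \big(\omega_n - \sum_{k \ge 1} \sigma_k c_{n-k}\big)/\sigma_0$.

def series_of_fraction(omega, sigma, n_terms, p):
    # First n_terms coefficients of the power series of omega(z)/sigma(z) about
    # z = 0 over GF(p); requires sigma(0) != 0 so that the expansion exists.
    inv0 = pow(sigma[0], -1, p)
    out = []
    for n in range(n_terms):
        acc = omega[n] if n < len(omega) else 0
        for k in range(1, min(n, len(sigma) - 1) + 1):
            acc -= sigma[k] * out[n - k]
        out.append(acc * inv0 % p)
    return out

When $\sigma(0) \ne 0$, applying this to a solution of the key equation returns $S_0, \dots, S_{2t-1}$ as its first $2t$ coefficients.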

The Berlekamp–Massey algorithm has found numerous applications outside of the coding context. These include fast algorithms in numerical analysis such as the Lanczos recursion and the Levinson–Schur algorithms for Toeplitz matrices, as well as problems of minimal realizations in system theory [a2].

References

[a1] E.R. Berlekamp, "Algebraic coding theory", McGraw-Hill (1968)
[a2] T. Kailath, "Encounters with the Berlekamp–Massey algorithm", in R.E. Blahut, D.J. Costello Jr., U. Maurer, T. Mittelholzer (eds.), Communications and Cryptography: Two Sides of One Tapestry, Kluwer Acad. Publ. (1994)
[a3] R.J. McEliece, "The theory of information and coding", Encycl. Math. Appl., 3, Addison-Wesley (1977)
[a4] J.L. Massey, "Shift-register synthesis and BCH decoding", IEEE Trans. Inform. Th., IT-15 (1969) pp. 122–127
[a5] Y. Sugiyama, M. Kasahara, S. Hirasawa, T. Namekawa, "A method for solving key equation for decoding Goppa codes", Inform. Control, 27 (1975) pp. 87–99
[a6] L.R. Welch, R.A. Scholtz, "Continued fractions and Berlekamp's algorithm", IEEE Trans. Inform. Th., IT-25 (1979) pp. 19–27
How to Cite This Entry:
Berlekamp-Massey algorithm. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Berlekamp-Massey_algorithm&oldid=11537
This article was adapted from an original article by Ian F. Blake (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article