ISIT 1997, Ulm, Germany, June 29 - July 4

The Complexity of Hard-Decision Decoding of Linear Codes


A. Barg*¹          E. Krouk²          H. C. A. van Tilborg³

¹Bell Laboratories 2C-375, 700 Mountain Avenue, Murray Hill, NJ 07974, USA
²St. Petersburg State Academy of Aerospace Instrumentation, Bol'shaja Morskaja 61, 190000 St. Petersburg, Russia
³Dept. of Mathematics and Computer Science, Eindhoven University of Technology, 5600 MB Eindhoven, The Netherlands

Abstract - We study a general method of minimum distance decoding of linear codes that, instead of decoding the original code, recovers the transmitted codeword by a number of decodings of shortened codes. We present an implementation of this method whose complexity for long linear codes has the smallest known value for any code rate R, 0 < R < 1.
Minimum distance decoding is the most powerful decoding method from the point of view of transmission reliability. Its applicability is hindered by high implementation complexity. Even in the hard-decision setting, all known algorithms have complexity that grows exponentially with the length of the code. Implementations of this decoding include brute-force methods such as successive inspection of all codewords and building up and storing the syndrome table (or the syndrome trellis). We study the worst-case complexity of decoding algorithms, measured either as the number of computer operations (time complexity) or the size of memory used for the decoding (space complexity). Thus, for an [n, k, d] code the decoding complexity does not exceed O(n · min(q^k, q^(n−k))).
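To make the two benchmarks concrete, here is a small Python sketch (added for illustration, not part of the original abstract) of syndrome-table decoding. The binary [7, 4] Hamming code is an assumed example; its table has q^(n−k) = 8 entries, whereas exhaustive inspection of codewords would compare against q^k = 16 words.

```python
import itertools
import numpy as np

# Assumed example: a parity-check matrix of the [7, 4] binary Hamming code.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)
n, r = H.shape[1], H.shape[0]

def syndrome(v):
    return tuple((H @ v) % 2)

# Build the syndrome table: for each syndrome, store a minimum-weight coset leader.
table = {}
for w in range(n + 1):
    for support in itertools.combinations(range(n), w):
        e = np.zeros(n, dtype=int)
        e[list(support)] = 1
        s = syndrome(e)
        if s not in table:          # first (lowest-weight) leader wins
            table[s] = e
    if len(table) == 2 ** r:        # all 2^(n-k) cosets covered
        break

def decode(y):
    """Hard-decision minimum distance decoding via the syndrome table."""
    e = table[syndrome(y)]
    return (y + e) % 2

# Usage: flip one position of the all-zero codeword and decode it back.
y = np.zeros(n, dtype=int)
y[2] = 1
print(decode(y))                    # recovers the all-zero codeword
```

For long codes both quantities grow exponentially in n, which is exactly the obstacle discussed above.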
The most efficient implementations of minimum distance decoding successively encode groups of k coordinates in the received vector y that correspond to information sets of the code and choose the codeword c closest to y. General algorithms that have the smallest known asymptotic complexity [5], [2], [4] are all based on this idea.
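A minimal sketch of this information-set idea, under assumptions of this note (the helper gf2_solve, the number of trials, and uniformly random coordinate subsets are illustrative choices, not details of [5], [2], [4]):

```python
import numpy as np

def gf2_solve(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination; return None if A is singular.
    (Hypothetical helper written for this sketch.)"""
    A, b = A.copy() % 2, b.copy() % 2
    dim = A.shape[0]
    for col in range(dim):
        pivot = next((r for r in range(col, dim) if A[r, col]), None)
        if pivot is None:
            return None                          # chosen columns are linearly dependent
        A[[col, pivot]] = A[[pivot, col]]
        b[[col, pivot]] = b[[pivot, col]]
        for r in range(dim):
            if r != col and A[r, col]:
                A[r] ^= A[col]
                b[r] ^= b[col]
    return b

def information_set_decode(G, y, trials=200, rng=np.random.default_rng(0)):
    """Repeatedly pick k coordinates as a candidate information set, re-encode the
    corresponding part of y, and keep the re-encoded codeword closest to y."""
    k, n = G.shape
    best, best_dist = None, n + 1
    for _ in range(trials):
        I = rng.choice(n, size=k, replace=False)   # candidate information set
        m = gf2_solve(G[:, I].T, y[I])             # message agreeing with y on I
        if m is None:
            continue
        c = (m @ G) % 2                            # full codeword for that message
        d = int(np.sum(c != y))
        if d < best_dist:
            best, best_dist = c, d
    return best
```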
After briefly commenting on these methods, we discuss a new approach. The idea is to perform a number of decodings of supercodes of the original code C, i.e., linear codes C' with C ⊂ C'. Decoding a supercode amounts to restricting oneself to a subset of the parity checks of C, i.e., to building a list of candidates based on part of the received syndrome. This idea proves efficient for short codes, allowing us to construct reduced syndrome tables. We work out an example for the [48, 24] code: decoding up to 5 errors requires about 8K of memory and about 3000 computer operations.
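The following sketch shows one way the reduced-table idea could look in code; it is an illustration under stated assumptions, not the construction actually used for the [48, 24] example. The subset `rows` of parity checks defines the supercode C', the table is keyed by the partial syndrome, and candidates are then filtered against the full syndrome of C; the weight bound t and the function names are hypothetical.

```python
import itertools
import numpy as np

def build_partial_table(H, rows, t):
    """Reduced syndrome table for the supercode C' given by the parity checks in
    `rows`: map each partial syndrome to all error patterns of weight <= t causing it."""
    Hs = H[rows, :]
    n = H.shape[1]
    table = {}
    for w in range(t + 1):
        for support in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(support)] = 1
            table.setdefault(tuple((Hs @ e) % 2), []).append(e)
    return table

def supercode_decode(H, y, rows, table):
    """Look up candidates by the partial syndrome (decoding of C'), then keep only
    those consistent with the full syndrome of the original code C."""
    s_full = (H @ y) % 2
    candidates = table.get(tuple(s_full[rows]), [])
    return [(y + e) % 2 for e in candidates if np.array_equal((H @ e) % 2, s_full)]
```

With fewer parity checks the table has fewer entries but each partial syndrome indexes a longer candidate list; this trade-off is what reduced syndrome tables exploit.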
Asymptotic results of our work are established for almost all linear codes, except for a fraction of codes that decays exponentially as the code length n grows. Let C be an [n, k] linear code and suppose k/n → R as n → ∞. Let H be the parity-check matrix of C. We restrict ourselves to correcting all coset leaders of weight up to nσ₀(R), where σ₀(R) = H₂⁻¹(1 − R). By [1], this is sufficient for maximum likelihood decoding of almost all long codes.
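As a small numerical illustration (not in the original text), σ₀(R) can be evaluated by inverting the binary entropy function numerically; for R = 1/2 this gives σ₀ ≈ 0.110, i.e., coset leaders of weight up to roughly 0.11·n are corrected.

```python
import numpy as np

def h2(x):
    """Binary entropy function H2(x), in bits."""
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

def gv_radius(R, tol=1e-12):
    """sigma_0(R) = H2^{-1}(1 - R), found by bisection on (0, 1/2]."""
    lo, hi = tol, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < 1 - R:
            lo = mid
        else:
            hi = mid
    return lo

print(round(gv_radius(0.5), 3))   # about 0.110 for rate R = 1/2
```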
Each iteration of our algorithm consists of the following steps. First, we choose a random partition of the set {1, 2, ..., n} into subsets of size x, k − x, and n − k. Then the last subset is partitioned into s = ⌈(n − k)/y⌉ segments of length y. For every placement of the y-segment, we isolate a linear code C(x|y) of length x + y with y parity checks, whose parity-check matrix is formed by the corresponding rows and columns of H. This code is decoded using the decoding algorithm of [3]. This decoding supplies us with a list of candidates for the message set of C. The final list of candidates for a chosen partition is formed from those message vectors that appear as decoding results of C(x|y) for at least two different placements of the y-segment.
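The skeleton below is one reading of this step description, not the authors' implementation. It assumes H is systematic, H = [A | I], so that each redundancy position indexes one parity check; it keeps the n − k redundancy coordinates fixed rather than drawing a fully random partition; and the callable decode_short, standing in for the list decoder of [3], is a hypothetical interface returning candidate messages for C(x|y).

```python
import numpy as np

def one_iteration(H, y_received, x, y_len, decode_short, rng=np.random.default_rng()):
    """Skeleton of one iteration of the partition-based decoder described above
    (illustrative simplification; decode_short(H_sub, y_sub) is an assumed interface)."""
    r, n = H.shape                          # r = n - k parity checks
    k = n - r
    info = rng.permutation(k)               # random split of the k information positions
    X = info[:x]                            # subset of size x; the rest has size k - x
    votes = {}
    for start in range(0, r, y_len):        # every placement of the y-segment
        seg = np.arange(k + start, min(k + start + y_len, n))
        rows = seg - k                       # parity checks indexed by this segment
        cols = np.concatenate([X, seg])
        H_sub = H[np.ix_(rows, cols)]        # parity-check matrix of C(x|y): y checks, x + y columns
        for msg in decode_short(H_sub, y_received[cols]):
            key = tuple(int(b) for b in msg)
            votes[key] = votes.get(key, 0) + 1
    # final candidate list: messages produced for at least two different placements
    return [np.array(m) for m, count in votes.items() if count >= 2]
```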

Let ε(α, R) = (1 − R)[1 − H₂(α/(1 − R))]. The asymptotic complexity of the algorithm is determined by the following theorem.

Theorem 1 For almost all linear codes, the decoding algorithm studied here performs maximum likelihood decoding. Its sequential implementation has complexity q^(nγ(R)(1+o(1))). For q = 2 the function γ(R) has an explicit closed form.

The function γ_q(R) is also immediate but requires a few more lines. Computations show that this function improves the best known result [4] by a small but finite value for any q ≥ 2 and all code rates R ∈ (0, 1).

REFERENCES
[1] V. M. Blinovskii, "Lower asymptotic bound on the number of linear code words in a sphere of given radius in F_q^n," Problems of Info. Trans., 23 (2) (1987), 50-53 (in Russian) and 130-132 (English translation).
[2] J. T. Coffey and R. M. F. Goodman, "The complexity of information set decoding," IEEE Trans. Inform. Theory, IT-35 (5) (1990), 1031-1037.
[3] I. Dumer, "Two decoding algorithms for linear codes," Problems of Info. Trans., 25 (1) (1989), 24-32 and 17-23.
[4] I. Dumer, "On minimum distance decoding of linear codes," Proc. 5th Joint Soviet-Swedish Int. Workshop Inform. Theory, Moscow (1991), pp. 50-52.
[5] E. A. Krouk, "Decoding complexity bound for linear block codes," Problems of Info. Trans., 25 (3) (1989), 103-107 and 251-254.

*Research done while at Dept. of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, The Netherlands.

© 1997 IEEE     0-7803-3956-8/97/$10.00
