• DocumentCode
    1914031
  • Title
    Convolutional decoding using recurrent neural networks
  • Author
    Hamalainen, Ari ; Henriksson, Jukka
  • Author_Institution
    Nokia Res. Center, Espoo, Finland
  • Volume
    5
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    3323
  • Abstract
    We show how recurrent neural network (RNN) convolutional decoders can be derived. As an example, we derive the RNN decoder for a rate-1/2 code with constraint length 3. The derived RNN decoder is tested in a Gaussian channel and the results are compared to those of the optimal Viterbi decoder. Some simulation results for codes of other constraint lengths are also given. The RNN decoder is also tested with a punctured code. It is seen that the RNN decoder can achieve the performance of the Viterbi decoder. The complexity of the RNN decoder appears to grow only polynomially, whereas that of the Viterbi algorithm grows exponentially. Furthermore, a hardware implementation of the proposed RNN decoder is feasible.
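    For context, the sketch below illustrates the conventional baseline the abstract compares against: encoding with a rate-1/2, constraint-length-3 convolutional code and hard-decision Viterbi decoding. It is not the authors' RNN decoder, and the generator polynomials (7, 5) in octal, the hard-decision (Hamming-metric) decoding, and the Python phrasing are all assumptions; the paper itself works with soft values over a Gaussian channel.

    # Minimal sketch (assumed baseline, not the paper's RNN decoder):
    # rate-1/2, K=3 convolutional encoding and hard-decision Viterbi decoding.
    G = (0b111, 0b101)        # assumed generators g0=7, g1=5 (octal)
    K = 3                     # constraint length
    N_STATES = 1 << (K - 1)   # 4 trellis states

    def encode(bits):
        """Rate-1/2 encoding: two parity bits per input bit."""
        state, out = 0, []
        for b in bits:
            reg = (b << (K - 1)) | state
            out.extend(bin(reg & g).count("1") & 1 for g in G)
            state = reg >> 1
        return out

    def viterbi_decode(received):
        """Hard-decision Viterbi decoding with a Hamming-distance branch metric."""
        INF = float("inf")
        metric = [0.0] + [INF] * (N_STATES - 1)   # encoder starts in state 0
        paths = [[] for _ in range(N_STATES)]
        for t in range(0, len(received), 2):
            r = received[t:t + 2]
            new_metric = [INF] * N_STATES
            new_paths = [None] * N_STATES
            for s in range(N_STATES):
                if metric[s] == INF:
                    continue
                for b in (0, 1):
                    reg = (b << (K - 1)) | s
                    expected = [bin(reg & g).count("1") & 1 for g in G]
                    branch = sum(e != x for e, x in zip(expected, r))
                    ns = reg >> 1
                    m = metric[s] + branch
                    if m < new_metric[ns]:       # keep the survivor path
                        new_metric[ns] = m
                        new_paths[ns] = paths[s] + [b]
            metric, paths = new_metric, new_paths
        best = min(range(N_STATES), key=lambda s: metric[s])
        return paths[best]

    if __name__ == "__main__":
        msg = [1, 0, 1, 1, 0, 0, 1]
        assert viterbi_decode(encode(msg)) == msg   # error-free channel sanity check

    The exponential factor the abstract refers to is visible here in N_STATES = 2^(K-1): the Viterbi trellis doubles with each unit increase in constraint length, which is the cost the proposed RNN decoder is claimed to avoid.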
  • Keywords
    convolutional codes; decoding; error correction codes; recurrent neural nets; Gaussian channel; convolutional decoding; error correction codes; punctured code; recurrent neural networks; Artificial neural networks; Circuits; Convolution; Convolutional codes; Maximum likelihood decoding; Optical transmitters; Recurrent neural networks; Testing; Very large scale integration; Viterbi algorithm;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1999. IJCNN '99. International Joint Conference on
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-5529-6
  • Type
    conf
  • DOI
    10.1109/IJCNN.1999.836193
  • Filename
    836193