Analysis of recursive convolutional codes and turbo codes as sources with memory

Authors: Valeriu Munteanu, Daniela Tarniceriu, Lucian Trifina

Abstract:
The paper proposes a general method for analyzing discrete sources with memory. Besides the classical entropy, we define new information measures for discrete sources with memory, similar to the information quantities specific to discrete channels. Based on this method, we show for the first time that convolutional and turbo encoding produce sources with memory. We apply this information analysis to the general case of a recursive convolutional encoder of rate RCC = 1/n0 and memory of order m, and to a turbo encoder of rate RTC = 1/3 with two systematic recursive convolutional component encoders. Each component encoder has memory of order m and is built from the same primitive feedback polynomial. For the convolutional and turbo codes, the information quantities H(Y/S), H(S,Y), H(S/Y), H(Y), H(S) and I(S,Y) have been computed, where S and Y denote the set of states and the set of messages of the encoder, respectively. The analysis considers two cases: n0 ≤ m + 1 and n0 > m + 1. When n0 = m + 1, the mutual information I(S,Y) is maximum and equal to m, which is also the entropy of the set of states. For turbo codes, the quantity I(S,Y) also depends on the input bit and on its probability.
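To illustrate the kind of computation the abstract describes, the following minimal Python sketch numerically evaluates the listed information quantities for one concrete small encoder: a rate-1/2 recursive systematic convolutional code with memory m = 2, feedback polynomial 1 + D + D^2 and feedforward polynomial 1 + D^2. This particular encoder, the assumption of i.i.d. equiprobable input bits, and the uniform stationary state distribution that follows from it are illustrative choices of ours, not taken from the paper's general derivation.

from itertools import product
from math import log2

# Illustrative encoder (assumption, not the paper's general case):
# rate-1/2 recursive systematic convolutional code, memory m = 2,
# feedback polynomial 1 + D + D^2 (octal 7), feedforward 1 + D^2 (octal 5).
M = 2                      # memory order m
P_U = {0: 0.5, 1: 0.5}     # i.i.d. equiprobable input bits (assumption)

def step(state, u):
    """One trellis transition: returns (next_state, output_symbol)."""
    s1, s2 = state
    d = u ^ s1 ^ s2        # feedback bit (taps D and D^2)
    p = d ^ s2             # parity bit (feedforward taps 1 and D^2)
    return (d, s1), (u, p) # output symbol = (systematic bit, parity bit)

states = list(product((0, 1), repeat=M))
# For equiprobable inputs the uniform distribution over the 2^m states is stationary.
p_s = {s: 1.0 / len(states) for s in states}

# Joint distribution P(S = s, Y = y) over states and n0-bit output symbols.
p_sy = {}
for s in states:
    for u, pu in P_U.items():
        _, y = step(s, u)
        p_sy[(s, y)] = p_sy.get((s, y), 0.0) + p_s[s] * pu

# Marginal distribution of the output symbol Y.
p_y = {}
for (s, y), p in p_sy.items():
    p_y[y] = p_y.get(y, 0.0) + p

H = lambda dist: -sum(p * log2(p) for p in dist.values() if p > 0)
H_S, H_Y, H_SY = H(p_s), H(p_y), H(p_sy)
print(f"H(S)   = {H_S:.3f} bits")              # equals m under the uniform state distribution
print(f"H(Y)   = {H_Y:.3f} bits")
print(f"H(S,Y) = {H_SY:.3f} bits")
print(f"H(Y/S) = {H_SY - H_S:.3f} bits")
print(f"H(S/Y) = {H_SY - H_Y:.3f} bits")
print(f"I(S,Y) = {H_S + H_Y - H_SY:.3f} bits")

The sketch treats the encoder as a Markov source: the state entropy, output entropy and their joint entropy are computed from the stationary state distribution and the trellis transitions, and the remaining quantities follow from the standard identities H(Y/S) = H(S,Y) - H(S), H(S/Y) = H(S,Y) - H(Y) and I(S,Y) = H(S) + H(Y) - H(S,Y).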

Keywords:
Markov sources
Convolutional codes
Turbo codes
Entropies

Published in: AEÜ-International Journal of Electronics and Communications (Volume 67, Issue 5, May 2013)

Publisher: Elsevier

ISSN Information: 1434-8411

