DocumentCode
659142
Title
Information-preserving Markov aggregation
Author
Geiger, Bernhard C. ; Temmel, Christoph
Author_Institution
Signal Processing & Speech Communication Laboratory, Graz University of Technology, Graz, Austria
fYear
2013
fDate
9-13 Sept. 2013
Firstpage
1
Lastpage
5
Abstract
We present a sufficient condition for a non-injective function of a Markov chain to be a second-order Markov chain with the same entropy rate as the original chain. This permits an information-preserving state space reduction by merging states or, equivalently, lossless compression of a Markov source on a sample-by-sample basis. The cardinality of the reduced state space is bounded from below by the node degrees of the transition graph associated with the original Markov chain. We also present an algorithm that lists all information-preserving state space reductions for a given transition graph. We illustrate our results by applying the algorithm to a bi-gram letter model of an English text.
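The core quantity in the abstract is the entropy rate H = -Σ_i π_i Σ_j P_ij log₂ P_ij of a Markov chain, and the claim that merging states can preserve it when the aggregated process is second-order Markov. The sketch below is a hand-constructed toy instance, not the paper's algorithm or its example: a 3-state chain whose states 1 and 2 are merged into one symbol B, chosen so that the previous symbol disambiguates the merged states. It computes the entropy rate of the original chain and of the aggregated pair process and shows they agree.

```python
from math import log2

# Toy 3-state chain (illustrative assumption, not from the paper).
# Merging map g: state 0 -> 'A', states 1 and 2 -> 'B'.
# A 'B' after an 'A' must be state 1; a 'B' after a 'B' must be state 2,
# so the aggregated process loses no information.
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [1.0, 0.0, 0.0]]

def stationary(P, iters=2000):
    """Stationary distribution by power iteration (chain assumed ergodic)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_rate(P, pi):
    """H = -sum_i pi_i * sum_j P_ij * log2(P_ij), in bits per symbol."""
    return -sum(pi[i] * sum(p * log2(p) for p in row if p > 0)
                for i, row in enumerate(P))

h_orig = entropy_rate(P, stationary(P))

# Aggregated process over {A, B}, lifted to symbol pairs so the
# second-order Markov chain becomes first-order on pair states.
# Pair-state order: (A,B), (B,B), (B,A); the pair (A,A) never occurs.
Q = [[0.0, 0.5, 0.5],   # (A,B) = state 1: next B w.p. 0.5, next A w.p. 0.5
     [0.0, 0.0, 1.0],   # (B,B) = state 2: next A w.p. 1
     [1.0, 0.0, 0.0]]   # (B,A) = state 0: next B w.p. 1

h_agg = entropy_rate(Q, stationary(Q))

print(f"original chain:   {h_orig:.4f} bits/symbol")
print(f"aggregated chain: {h_agg:.4f} bits/symbol")
```

Both rates come out equal (0.4 bits per symbol here), illustrating an information-preserving merge; for an arbitrary merge the aggregated process is generally a hidden Markov process with a strictly smaller entropy rate.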
Keywords
Markov processes; data compression; entropy; signal processing; English text; Markov source; bigram letter model; entropy rate; information preserving Markov aggregation; information preserving state space reduction; lossless compression; noninjective function; second order Markov chain; transition graph; Biological system modeling; Computational modeling; Merging; Partitioning algorithms; Signal processing algorithms; Markov chain; model order reduction; n-gram model
fLanguage
English
Publisher
ieee
Conference_Title
2013 IEEE Information Theory Workshop (ITW)
Conference_Location
Sevilla
Print_ISBN
978-1-4799-1321-3
Type
conf
DOI
10.1109/ITW.2013.6691265
Filename
6691265
Link To Document