Abstract:
This paper gives an axiomatic characterization of information measures of dimension k. The Shannon entropy is an information measure of dimension one; the directed divergence and the information improvement are examples of 2-dimensional and 3-dimensional information measures, respectively. From a k-dimensional information measure one obtains the measures of all lower dimensions (including dimensions 1, 2, and 3, which have already proved useful) at once. We determine all information measures that depend on several discrete probability distributions, possess the sum property, and satisfy the additivity property. It is shown that the additive k-dimensional information measures with the sum property are essentially linear combinations of Shannon entropies and Kerridge inaccuracies.
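For orientation, the standard definitions of the measures named above (not spelled out in the abstract itself) are, for discrete distributions P = (p_1, ..., p_n), Q = (q_1, ..., q_n), R = (r_1, ..., r_n):

\[
H(P) = -\sum_{i=1}^{n} p_i \log p_i
\qquad \text{(Shannon entropy)},
\]
\[
H(P, Q) = -\sum_{i=1}^{n} p_i \log q_i
\qquad \text{(Kerridge inaccuracy)},
\]
\[
D(P \,\|\, Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
= H(P, Q) - H(P)
\qquad \text{(directed divergence)},
\]
\[
I(P; Q, R) = \sum_{i=1}^{n} p_i \log \frac{r_i}{q_i}
= D(P \,\|\, Q) - D(P \,\|\, R)
\qquad \text{(information improvement)}.
\]

Note that the 2- and 3-dimensional measures shown here are themselves linear combinations of Shannon entropies and Kerridge inaccuracies, illustrating the form of the characterization stated in the abstract.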