Title :
Reducing Algorithm Complexity for Computing an Aggregate Uncertainty Measure
Author :
Liu, Chunsheng; Grenier, Dominic; Jousselme, Anne-Laure; Bossé, Eloi
Author_Institution :
Laval Univ.
Abstract :
In the theory of evidence, two kinds of uncertainty coexist: nonspecificity and discord. An aggregate uncertainty (AU) measure has been defined to capture both kinds of uncertainty in a single aggregate quantity. Meyerowitz et al. proposed an algorithm for calculating AU and validated its practical usage. Although this algorithm was proven correct by Klir and Wierman, in some cases it remains too complex; in fact, when the cardinality of the frame of discernment is very large, computing AU can become infeasible. Therefore, based on Klir's and Harmanec's seminal work, we give some justifications for restricting the computation of AU(Bel) to the core of the corresponding belief function, and we propose an algorithm to calculate AU(Bel), the F-algorithm, which reduces the computational complexity of the original algorithm of Meyerowitz et al. We prove that this algorithm gives the same results as Meyerowitz's algorithm, and we outline the conditions under which it reduces the computational complexity significantly. Moreover, we illustrate the use of the F-algorithm for computing AU in a practical target-identification scenario.
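Illustrative Sketch :
The following is a minimal Python sketch, not the authors' implementation, of the original AU algorithm of Meyerowitz et al. in its standard Harmanec-Klir formulation, assuming the belief function is specified by its mass assignment (a mapping from focal elements to masses). Here AU(Bel) is the maximum of the Shannon entropy -sum_x p(x) log2 p(x) over all probability distributions p with sum_{x in A} p(x) >= Bel(A) for every A. The function names (belief, cond_belief, aggregate_uncertainty) and the data layout are illustrative only. The brute-force search over all subsets makes explicit the exponential cost in the cardinality of the frame, which is the cost the F-algorithm reduces by restricting the computation to the core of the belief function.

from itertools import combinations
from math import log2


def belief(masses, subset):
    """Bel(A): total mass of the focal elements contained in A."""
    return sum(m for focal, m in masses.items() if focal <= subset)


def cond_belief(masses, B, removed):
    """Belief of B once the already-processed block `removed` has been
    conditioned out: Bel'(B) = Bel(B u removed) - Bel(removed)."""
    return belief(masses, frozenset(B) | removed) - belief(masses, removed)


def aggregate_uncertainty(masses, frame):
    """AU(Bel): maximum Shannon entropy over the probability distributions
    consistent with Bel, computed by repeatedly extracting the subset A
    that maximizes Bel(A)/|A| (ties broken by larger cardinality)."""
    X = set(frame)
    removed = frozenset()
    probs = []
    while X and cond_belief(masses, X, removed) > 0:
        best_A, best_ratio = None, -1.0
        # Exhaustive search over all non-empty subsets of the remaining
        # frame: this is the exponential step whose cost the F-algorithm targets.
        for r in range(1, len(X) + 1):
            for A in combinations(X, r):
                A = frozenset(A)
                ratio = cond_belief(masses, A, removed) / len(A)
                if ratio > best_ratio or (ratio == best_ratio and len(A) > len(best_A)):
                    best_A, best_ratio = A, ratio
        probs.extend([best_ratio] * len(best_A))   # uniform share on the block A
        removed |= best_A
        X -= best_A
    # Elements left with zero residual belief get probability 0 and contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)


# Hypothetical example: m({a}) = 0.3, m({a,b}) = 0.4, m({a,b,c}) = 0.3
m = {frozenset("a"): 0.3, frozenset("ab"): 0.4, frozenset("abc"): 0.3}
print(aggregate_uncertainty(m, "abc"))   # approximately 1.58 bits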
Keywords :
belief networks; belief function; uncertainty handling; aggregate uncertainty (AU); aggregate uncertainty measure; nonspecificity uncertainty; discord uncertainty; F-algorithm; computational complexity; Dempster–Shafer (D–S) theory; decision support systems; entropy; measurement uncertainty; set theory;
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans
DOI :
10.1109/TSMCA.2007.893457