Author_Institution :
Philips Healthcare, El Paso, TX, USA
Abstract :
In many practical situations, we have only partial information about the probabilities. In some cases, we have crisp (interval) bounds on the probabilities and/or on related statistical characteristics; in other situations, we have fuzzy bounds, i.e., different interval bounds with different degrees of certainty. Under such uncertainty, we do not know the exact value of the desired characteristic. It is then desirable to find its worst possible value, its best possible value, and its “typical” value, i.e., the value corresponding to the “most probable” probability distribution. Usually, as such a “typical” distribution, we select the one with the largest entropy. This works well in the usual cases, when the information about the distribution consists of the values of moments and other characteristics. For example, if we only know the first and second moments, then the distribution with the largest entropy is the normal (Gaussian) one. However, in some situations, we know the entropy (= amount of information) of the distribution. In this case, the maximum entropy approach does not work, since all the distributions consistent with our knowledge have the exact same entropy value. In this paper, we show how the main ideas of the maximum entropy approach can be extended to this case.
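The moments example in the abstract can be checked numerically. The sketch below (an illustration added here, not part of the paper) compares the differential entropy of a standard normal distribution with that of a uniform distribution chosen to have the same mean (0) and variance (1); the Gaussian's entropy is larger, consistent with the maximum entropy principle for fixed first and second moments.

```python
import math

# Among all distributions with a given mean and variance, the Gaussian
# has the largest differential entropy. Compare N(0, 1) with a uniform
# distribution matched to the same mean and variance.

sigma2 = 1.0  # common variance

# Differential entropy of N(0, sigma^2): 0.5 * ln(2 * pi * e * sigma^2)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)

# Uniform on [-a, a] has mean 0 and variance a^2 / 3;
# choose a so that its variance equals sigma2.
a = math.sqrt(3 * sigma2)
h_uniform = math.log(2 * a)  # differential entropy of Uniform(-a, a): ln(2a)

print(f"Gaussian entropy: {h_gauss:.4f}")   # ~ 1.4189
print(f"Uniform entropy:  {h_uniform:.4f}") # ~ 1.2425
assert h_gauss > h_uniform
```

The same comparison holds for any other distribution with matching mean and variance: its differential entropy cannot exceed the Gaussian value 0.5 ln(2πeσ²).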
Keywords :
entropy; fuzzy set theory; probability; entropy constraints; fuzzy bounds; maximum entropy techniques; probability distribution; statistical characteristics; probability density function; uncertainty; computer science; economic forecasting; medical services;