Essential properties of entropies and other measures of information are derived from the noiseless coding theorems, source entropies, and Huffman codes (and also from forecasting and experiments). Conversely, characterization theorems are given based on these properties. Some of the most important are Shannon's inequality, boundedness on an interval, subadditivity, additivity, branching, and expansibility. Entropies of mixed probabilistic and nonprobabilistic character and convex f-divergences are also mentioned, among others. Some unsolved problems are stated.
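For reference, a sketch of the standard formulations of three of the properties named above, which the abstract only lists; the notation (Shannon entropy $H_n$, joint distribution $(p_{jk})$ with marginals) is the conventional one and is assumed here, not taken from the abstract itself:
\[
-\sum_{k=1}^{n} p_k \log p_k \;\le\; -\sum_{k=1}^{n} p_k \log q_k
\qquad \text{(Shannon's inequality, for probability distributions } (p_k), (q_k)\text{),}
\]
\[
H_{mn}(p_{11},\ldots,p_{mn}) \;\le\; H_m(p_{1\cdot},\ldots,p_{m\cdot}) + H_n(p_{\cdot 1},\ldots,p_{\cdot n})
\qquad \text{(subadditivity),}
\]
with equality in the second relation when $p_{jk} = p_{j\cdot}\, p_{\cdot k}$ for all $j,k$, i.e. when the two marginal distributions are independent (additivity).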