Abstract:
Information theory, in particular mutual information, has been widely used to investigate neural processing in various brain areas. Shannon mutual information quantifies how much information is, on average, contained in a set of neural activities about a set of stimuli. To extend a similar approach to single-stimulus encoding, we need to introduce a quantity specific to a single stimulus. This quantity has been defined in the literature by four different measures, but none of them satisfies all the intuitive properties (non-negativity, additivity) that characterize mutual information. We present here a detailed analysis of the different meanings and properties of these four definitions. We show that all these measures satisfy at least a weaker additivity condition, i.e. additivity limited to the response set. This allows us to use them for analysing correlated coding, as we illustrate with a toy example based on hippocampal place cells.
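As a point of reference for the quantities discussed above, the standard definition of mutual information and its decomposition into single-stimulus terms can be sketched as follows (the notation $I(s)$, $p(r|s)$ is assumed here for illustration and may differ from the paper's own):
\[
I(S;R) \;=\; \sum_{s \in S}\sum_{r \in R} p(s,r)\,\log_2 \frac{p(s,r)}{p(s)\,p(r)} \;=\; \sum_{s \in S} p(s)\, I(s),
\]
where one of the candidate single-stimulus measures found in the literature, the specific surprise, reads
\[
I(s) \;=\; \sum_{r \in R} p(r|s)\,\log_2 \frac{p(r|s)}{p(r)}.
\]
Averaging $I(s)$ over the stimulus distribution $p(s)$ recovers $I(S;R)$, which is the sense in which such measures extend the average mutual information to single-stimulus encoding.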