Abstract:
For a dynamical system far from equilibrium, one has to deal with empirical probabilities defined through time averages, and the main problem then is how to formulate an appropriate statistical thermodynamics. The common answer is that the standard Boltzmann–Gibbs functional expression for the entropy should be used, with the empirical probabilities substituted for the Gibbs measure. Other functional expressions have been suggested, but apparently without a clear mechanical foundation. Here it is shown how a natural extension of the original procedure employed by Gibbs and Khinchin in defining entropy, with the sole proviso that the empirical probabilities be used, leads to a functional expression for the entropy which is in general different from the Boltzmann–Gibbs one. In particular, the Gibbs entropy is recovered for empirical probabilities of Poisson type, while the Tsallis entropies are recovered for a deformation of the Poisson distribution.
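For reference, the two entropy functionals named in the abstract have the following standard textbook forms; the notation here (probabilities $p_i$, Boltzmann constant $k_B$, Tsallis index $q$) is an assumption and may differ from that used in the body of the paper.

% Standard Boltzmann–Gibbs and Tsallis entropy functionals (textbook forms,
% not necessarily the paper's own notation): S_q reduces to S_BG as q -> 1.
\[
  S_{\mathrm{BG}} \;=\; -\,k_B \sum_i p_i \ln p_i ,
  \qquad
  S_q \;=\; k_B \, \frac{1 - \sum_i p_i^{\,q}}{q - 1},
  \qquad
  \lim_{q \to 1} S_q \;=\; S_{\mathrm{BG}} .
\]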