Some known properties of the concept of "almost total" information convergence of order

for an arbitrary sequence of distributions are defined and reviewed. A theorem is then proved showing that, in the case of stochastic point processes, almost total information convergence is equivalent to two simple conditions,