Abstract:
The conventional input-output relation for linear sampled-data systems whose output is sampled at an integer multiple of the input sampling rate is shown to give incorrect results when the system transfer function contains time delays that are integer multiples of the basic sampling interval T. The correct input-output relation is developed.