Abstract:
When SONET/SDH was standardized in 1988, the highest line rate was 156 Mb/s. The rapid growth of the Internet, among other factors, has pushed the speed of today's networks to 10 Gb/s or 40 Gb/s (10G, 40G). In 2000, ITU-T Recommendation G.783 specified the maximum jitter for error-free communications on 10G and 40G networks, but there was no standard process to verify the accuracy of jitter measurements and no reference source with a known amount of jitter. Accurate jitter measurements at high speed are difficult because of the small time intervals involved, yet they are essential for implementing error-free networks. This paper describes a method which shows that accurate jitter measurements at 10G and above must correctly account for both pattern-dependent and random jitter. Pattern-dependent jitter is caused to a much greater extent by the transmitter than by the receiver. Incorrect assumptions about the source of pattern-dependent jitter can result in large inconsistencies in jitter measurements at high speed. The method outlined here solves those problems.
Keywords:
SONET; synchronous digital hierarchy; optical fibre networks; jitter; pattern-dependent jitter; random jitter; jitter measurements; calibration; calibration standards; measurement standards; electric variables measurement; standardisation; telecommunication equipment testing; ITU-T Recommendation G.783; 10 Gbit/s; 40 Gbit/s; 156 Mbit/s; amplitude modulation; optical modulation; optical receivers; optical transmitters; signal generators; time measurement