DocumentCode :
1449699
Title :
A digital background calibration technique for time-interleaved analog-to-digital converters
Author :
Fu, Daihong ; Dyer, Kenneth C. ; Lewis, Stephen H. ; Hurst, Paul J.
Author_Institution :
University of California, Davis, CA, USA
Volume :
33
Issue :
12
fYear :
1998
fDate :
12/1/1998
Firstpage :
1904
Lastpage :
1911
Abstract :
A 10-bit 40-Msample/s two-channel parallel pipelined ADC with monolithic digital background calibration has been designed and fabricated in a 1-μm CMOS technology. Adaptive signal processing and extra resolution in each channel are used to carry out digital background calibration. Test results show that the ADC achieves a signal-to-noise-and-distortion ratio of 55 dB for a 0.8-MHz sinusoidal input, a peak integral nonlinearity of 0.34 LSB, and a peak differential nonlinearity of 0.14 LSB, both at a 10-bit level. The active area is 42 mm², and the power dissipation is 565 mW from a 5-V supply.
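As an illustration of the idea summarized in the abstract (a hedged sketch, not the authors' implementation): one way to realize digital background calibration of a two-channel time-interleaved ADC is to add a known zero-mean pseudorandom calibration sequence at each channel's input, estimate each channel's gain and offset by correlating and averaging the digital outputs against that sequence, and then remove the mismatch digitally. The Python sketch below uses simple block averages where the paper describes adaptive signal processing; the signal amplitudes, mismatch values, and variable names are illustrative assumptions.

# Minimal sketch of background gain/offset calibration for a two-channel
# time-interleaved ADC. A known pseudorandom (PN) calibration sequence is
# assumed to be injected at each channel's input; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
t = np.arange(n)
x = 0.4 * np.sin(2 * np.pi * 0.01234 * t)        # zero-mean input signal
cal = 0.1 * rng.choice([-1.0, 1.0], size=n)      # known PN calibration sequence

gains = np.array([1.000, 0.985])                 # assumed channel gain mismatch
offsets = np.array([0.000, 0.004])               # assumed channel offset mismatch
chan = t % 2                                     # channel 0 = even, 1 = odd samples

raw = gains[chan] * (x + cal) + offsets[chan]    # uncalibrated interleaved output

# Background estimation per channel: the sample mean gives the offset, and
# correlation with the PN sequence gives the gain, because the input and the
# PN sequence are zero-mean and uncorrelated with each other.
g_est = np.empty(2)
o_est = np.empty(2)
for c in (0, 1):
    m = chan == c
    o_est[c] = raw[m].mean()
    g_est[c] = np.dot(raw[m] - o_est[c], cal[m]) / np.dot(cal[m], cal[m])

# Digital correction: remove offset, normalize gain, strip the known PN signal.
corrected = (raw - o_est[chan]) / g_est[chan] - cal

print("gain estimates:  ", np.round(g_est, 3))   # roughly [1.000, 0.985]
print("offset estimates:", np.round(o_est, 3))   # roughly [0.000, 0.004]

In this sketch the block correlation stands in for the adaptive (e.g., LMS-style) update that would run continuously in the background; the correction itself is purely digital, matching the abstract's description of monolithic digital background calibration.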
Keywords :
CMOS integrated circuits; adaptive signal processing; analogue-digital conversion; calibration; integrated circuit noise; parallel processing; pipeline processing; 1 micron; 10 bit; 5 V; 565 mW; CMOS technology; adaptive signal processing; analog-to-digital converters; digital background calibration technique; parallel pipelined ADC; time-interleaved ADC; two-channel ADC; Adaptive signal processing; Analog-digital conversion; CMOS technology; Calibration; Interleaved codes; Multiplexing; Power dissipation; Signal resolution; Signal sampling; Testing;
fLanguage :
English
Journal_Title :
IEEE Journal of Solid-State Circuits
Publisher :
IEEE
ISSN :
0018-9200
Type :
jour
DOI :
10.1109/4.735530
Filename :
735530