Analog and Digital Conversion/ADC Converters
ADC Conversion Steps
An Analog-to-Digital Converter (ADC) takes an analog input signal and converts it, through a mathematical function, into a digital output signal. While there are many ways of implementing an ADC, there are four conceptual steps that occur (sketched in code after the list):
- The anti-aliasing filter blocks unwanted frequencies.
- The signal is sampled.
- The sampled signal is quantized.
- The quantized signal is digitally coded.
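To make these steps concrete, here is a minimal Python sketch of an idealized converter that samples, quantizes, and codes a signal. The anti-aliasing filter is omitted, and the test signal, sample rate, reference voltage, and resolution are invented example values, not anything prescribed above.

```python
import math

def ideal_adc(signal, sample_rate_hz, duration_s, n_bits, v_ref):
    """Sample signal(t), quantize to n_bits over 0..v_ref, return binary code words."""
    levels = 2 ** n_bits
    codes = []
    num_samples = int(sample_rate_hz * duration_s)
    for k in range(num_samples):
        t = k / sample_rate_hz                           # step 2: sample at discrete instants
        v = min(max(signal(t), 0.0), v_ref)              # clip to the converter's input range
        code = min(int(v / v_ref * levels), levels - 1)  # step 3: quantize to one of 2^n levels
        codes.append(format(code, f"0{n_bits}b"))        # step 4: code as an n-bit binary word
    return codes

# Example: a 1 kHz sine centred at 1.65 V, converted by an 8-bit, 3.3 V, 8 kHz ADC for 1 ms.
sine = lambda t: 1.65 + 1.0 * math.sin(2 * math.pi * 1000 * t)
print(ideal_adc(sine, 8000, 0.001, 8, 3.3))
```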
Flash ADC
Flash ADCs are the fastest type of ADC because only one clock pulse is needed to produce a digital output. A flash ADC uses a direct-comparison architecture: an n-bit converter requires 2^n - 1 comparators, 2^n resistors, and a thermometer-to-binary encoder.
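As a rough illustration of the comparator bank and the thermometer-to-binary encoder, here is a small behavioural model in Python. It assumes an ideal resistor ladder that places the 2^n - 1 comparator thresholds at equal fractions of the reference voltage; the input and reference values in the example are made up.

```python
def flash_adc(v_in, v_ref, n_bits):
    """Return the binary output code for an analog input v_in in the range 0..v_ref."""
    num_comparators = 2 ** n_bits - 1
    step = v_ref / (2 ** n_bits)
    # Each comparator fires if the input exceeds its ladder tap voltage (thermometer code).
    thermometer = [1 if v_in > (i + 1) * step else 0 for i in range(num_comparators)]
    # The thermometer-to-binary encoder simply counts how many comparators fired.
    return format(sum(thermometer), f"0{n_bits}b")

print(flash_adc(2.1, 3.3, 3))  # e.g. a 3-bit conversion of 2.1 V against a 3.3 V reference
```

Because all of the comparators switch in parallel, the conversion completes in a single step, which is why the flash architecture is the fastest; the cost is that the comparator count roughly doubles with every added bit of resolution.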
Successive Approximation ADC
Successive approximation is one of the most widely used methods due to its efficiency. A SAR ADC is built around a comparator, control logic with a successive-approximation register (SAR), and a D/A converter, as described below.
Working: Initially, the MSB of the SAR is set, i.e. d1 = 1, so the code 1000 is applied to the 4-bit D/A converter. The D/A converter generates the corresponding analog value, which the control logic presents to the comparator as VR. The comparator therefore has two inputs: VR (the reference voltage from the D/A converter) and VA (the analog input). If VR > VA, the comparator output is low and bit d1 is reset; if VR < VA, the comparator output is high and bit d1 remains set. The same procedure is repeated for the remaining bits d2, d3, ..., dn, and the output may be taken in serial or parallel form. Advantage: the conversion time is fixed, since it does not depend upon the amplitude of the analog input.
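The bit-by-bit search described above can be sketched in a few lines of Python. This is a behavioural model only: it assumes an ideal n-bit D/A converter spanning 0 to Vref, and the input voltage, reference, and bit width in the example are illustrative.

```python
def sar_adc(v_a, v_ref, n_bits):
    """Approximate the analog input v_a with an n-bit code, testing bits MSB first."""
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        code |= 1 << bit                      # tentatively set the current bit (d1, d2, ...)
        v_r = code / (2 ** n_bits) * v_ref    # ideal DAC output for the trial code
        if v_r > v_a:                         # comparator output low: reset the bit
            code &= ~(1 << bit)
        # otherwise (v_r <= v_a) the comparator output is high and the bit stays set
    return format(code, f"0{n_bits}b")

print(sar_adc(2.5, 3.3, 4))  # a 4-bit conversion always takes exactly 4 comparison cycles
```

Each bit costs exactly one comparison, so an n-bit conversion always takes n cycles, matching the fixed conversion time noted above.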
Sigma-Delta ADC
In systems that require at least 16 bits of resolution, such as audio electronics, by far the most common type of ADC is the sigma-delta converter.
Sigma-delta converters, unlike other kinds of ADCs, never sample the signal at any particular point in time. Instead, each output digital number is a weighted average of the signal over the entire sampling interval.
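As a hedged sketch of this averaging behaviour, the following Python model implements a textbook first-order sigma-delta modulator followed by a simple block-averaging decimator. The oversampling ratio, test signal, and decimation scheme are invented example choices, not a description of any particular device.

```python
import math

def sigma_delta(samples, osr):
    """First-order modulator; then average each block of `osr` 1-bit outputs."""
    integrator, feedback, bitstream = 0.0, 0.0, []
    for x in samples:
        integrator += x - feedback           # "sigma": accumulate the input-minus-feedback error
        bit = 1 if integrator >= 0 else 0
        feedback = 1.0 if bit else -1.0      # "delta": 1-bit DAC output fed back
        bitstream.append(bit)
    # Crude decimation: each output value is the mean of osr consecutive bits.
    return [sum(bitstream[i:i + osr]) / osr for i in range(0, len(bitstream), osr)]

# Example: a slow sine oversampled 64x; the block averages follow a rescaled copy of the input.
osr = 64
samples = [math.sin(2 * math.pi * k / (16 * osr)) for k in range(16 * osr)]
print(sigma_delta(samples, osr))
```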