# 1.41 N-bit DAC resolution and quantization error


1.41 Consider an N-bit DAC whose output varies from 0 to V_FS (where the subscript FS denotes "full-scale").

(a) Show that a change in the least significant bit (LSB) induces a change of V_FS/(2^N − 1) in the output. This is the resolution of the converter.
(b) Convince yourself that the DAC can produce any desired output voltage between 0 and V_FS with at most V_FS/[2(2^N − 1)] error (i.e., one-half the resolution). This is called the quantization error of the converter.
(c) For V_FS = 5 V, how many bits are required to obtain a resolution of 2 mV or better? What is the actual resolution obtained? What is the resulting quantization error?
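The relationships in parts (a)–(c) can be checked numerically. The sketch below assumes the usual DAC transfer function V_out = V_FS · D/(2^N − 1) for code D in {0, …, 2^N − 1}, so one LSB step changes the output by V_FS/(2^N − 1), and the worst-case quantization error is half that step:

```python
V_FS = 5.0          # full-scale output in volts (given in part c)
target_res = 2e-3   # required resolution: 2 mV

# (a) With V_out = V_FS * D / (2^N - 1), incrementing the code D by
#     one LSB changes the output by V_FS / (2^N - 1).
def resolution(n_bits):
    return V_FS / (2**n_bits - 1)

# (c) Find the smallest N whose resolution meets the 2 mV requirement.
N = 1
while resolution(N) > target_res:
    N += 1

res = resolution(N)   # actual resolution achieved with N bits
q_err = res / 2       # (b) worst-case quantization error: half an LSB

print(N)              # number of bits required
print(res * 1e3)      # actual resolution, in mV
print(q_err * 1e3)    # quantization error, in mV
```

With these numbers, 11 bits give 5/2047 ≈ 2.44 mV (too coarse), so N = 12 is required; the actual resolution is 5/4095 ≈ 1.22 mV and the quantization error is about 0.61 mV.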
