a. A 20 V dc voltage is measured by analog and digital multimeters. The analog instrument is on
its 25 V range, and its specified accuracy is ±2%. The digital meter has a 3½-digit display
and an accuracy of ±(0.6% + 1 digit). Determine the measurement accuracy in each case.
b. A digital frequency meter has a time base derived from a 1 MHz clock generator, frequency-
divided by decade counters. Determine the measured frequency when a 1.512 kHz sine wave is
applied and the time base uses (a) six decade counters and (b) four decade counters.
Part a
Analog instrument:
Voltage error = ±2% of the 25 V range = ±0.02 × 25 V = ±0.5 V
Error (as a percentage of the 20 V reading) = ±(0.5 V / 20 V) × 100% = ±2.5%

Digital instrument (20 V displayed on a 3½-digit display):
1 digit = 0.1 V
Voltage error = ±(0.6% of reading + 1 digit)
= ±(0.12 V + 0.1 V)
= ±0.22 V
Error = ±(0.22 V / 20 V) × 100% = ±1.1%
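As a quick numeric check, the Part a arithmetic can be reproduced in a short Python sketch; the 2%, 0.6%, and 1-digit (0.1 V) figures are taken directly from the problem statement, and the variable names are just illustrative:

```python
# Error calculations for the 20 V reading (values from the problem statement).

reading = 20.0  # measured voltage, V

# Analog meter: accuracy is ±2% of the 25 V full-scale range.
analog_error = 0.02 * 25.0                        # ±0.5 V
analog_error_pct = analog_error / reading * 100   # ±2.5 % of reading

# Digital meter: ±(0.6% of reading + 1 digit), where 1 digit = 0.1 V.
digital_error = 0.006 * reading + 0.1             # ±0.22 V
digital_error_pct = digital_error / reading * 100  # ±1.1 % of reading

print(f"Analog:  ±{analog_error:.2f} V (±{analog_error_pct:.1f} %)")
print(f"Digital: ±{digital_error:.2f} V (±{digital_error_pct:.1f} %)")
```

Note how the fixed 1-digit term makes the digital meter's percentage error grow as the reading falls below full scale, even though it still beats the analog meter here.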
Part b
(a) Using six decade counters:
Time base frequency f1 = 1 MHz / 10^6 = 1 Hz
Time base period T1 = 1/f1 = 1 s
Input signal period Ti = 1/(1.512 kHz) ≈ 661.4 µs
Cycles counted = T1/Ti = 1 s × 1.512 kHz = 1512
Measured frequency on the display = 1512 cycles / 1 s = 1.512 kHz
(b) Using four decade counters:
Time base frequency f2 = 1 MHz / 10^4 = 100 Hz
Time base period T2 = 1/f2 = 10 ms
Input signal period Ti ≈ 661.4 µs (as before)
Cycles counted = T2/Ti = 10 ms × 1.512 kHz = 15.12, so the counter registers only 15 complete cycles
Measured frequency on the display = 15 cycles / 10 ms = 001.5 kHz
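The counting scheme in both cases can be sketched in a few lines of Python: the counter totals whole input cycles over one time-base period, so any fractional cycle is truncated, which is exactly why the four-decade time base loses the last two digits. The function name and structure here are illustrative, not from the original solution:

```python
# Sketch of the frequency-counter arithmetic: gate time is the 1 MHz
# clock period multiplied by 10**decades, and the display shows the
# whole number of input cycles counted during that gate time.

clock_hz = 1e6     # 1 MHz reference clock
f_in_hz = 1512.0   # 1.512 kHz input sine wave

def measured_hz(decades):
    """Displayed frequency for a given number of decade counters."""
    gate_s = (10 ** decades) / clock_hz   # time-base (gate) period, s
    count = int(f_in_hz * gate_s)         # whole cycles counted (truncated)
    return count / gate_s                 # frequency shown on the display

print(measured_hz(6))   # gate = 1 s    -> 1512.0 Hz (1.512 kHz)
print(measured_hz(4))   # gate = 10 ms  -> 1500.0 Hz (001.5 kHz)
```

Fewer decade counters give a shorter gate time and hence a faster-updating but coarser reading, which matches the 1.512 kHz vs 001.5 kHz results above.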