A circuit is to be designed to keep the voltage across a 50 Ω resistor constant when the supply voltage changes. Two resistors of resistances 100 Ω and 50 Ω are connected in series across a 110 V AC source. Determine the voltage across the 50 Ω resistor. If the supply voltage varies between 100 V and 120 V, over what range must the 100 Ω resistor be varied to keep the voltage across the 50 Ω resistor constant?
According to Ohm's law,
V = I R
For a constant current I, the voltage across a resistor is proportional to its resistance: V ∝ R.
The two resistors form a voltage divider, so the voltage across the 50 Ω resistor is
V = (50 / 150) × 110 = 36.67 V
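As a quick numeric check, here is a minimal Python sketch of the voltage-divider step; the variable names (R_series, R_load, V_supply) are illustrative, not from the original solution.

```python
# Voltage divider: two resistors in series across a supply.
R_series = 100.0   # ohms, series (adjustable) resistor
R_load = 50.0      # ohms, resistor whose voltage must stay constant
V_supply = 110.0   # volts, nominal supply

# The load resistor gets a share of the supply proportional to its resistance.
V_load = V_supply * R_load / (R_series + R_load)
print(f"Voltage across the 50 ohm resistor: {V_load:.2f} V")  # 36.67 V
```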
When the supply voltage is 100 V, the voltage across the series resistor must be
100 − 36.67 = 63.33 V
When the supply voltage is 120 V, the voltage across the series resistor must be
120 − 36.67 = 83.33 V
To hold the 50 Ω resistor at 36.67 V, the current must stay at I = 36.67 / 50 = 0.733 A, so the required series resistance is R = V / I:
at 100 V: R = 63.33 / 0.733 ≈ 86.4 Ω
at 120 V: R = 83.33 / 0.733 ≈ 113.6 Ω
Therefore the 100 Ω resistor must be varied over the range of about 86.4 Ω to 113.6 Ω (equivalently, the voltage across it ranges from 63.33 V to 83.33 V).