Now, imagine that a lamp needs 100W to operate. Let’s consider two ways to achieve this; a short sketch of the arithmetic follows the list.
- Let’s use very low voltage, 5V. The current needed to power a 100W lamp will be 100W / 5V = 20A.
- Let’s use very high voltage, 500V. The current needed to power a 100W lamp will be 100W / 500V = 0.2A.
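To make the arithmetic explicit, here is a minimal Python sketch of the current calculation. It assumes an idealized lamp that always draws exactly 100W regardless of voltage; the constant names are just illustrative.

```python
# Current needed for a 100 W lamp at two different supply voltages.
# Assumes an ideal load that draws exactly P_LAMP no matter the voltage.
P_LAMP = 100.0  # W, power the lamp needs

for voltage in (5.0, 500.0):      # V, the two supply options
    current = P_LAMP / voltage    # I = P / U
    print(f"{voltage:>5.0f} V -> {current:4.1f} A")
```

Running it prints 20.0 A for the 5V case and 0.2 A for the 500V case, matching the numbers above.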
Now let’s assume that each wire has a constant resistance, equal to 0.01Ω, or 10 milliohms, which is quite low, but still nonzero. The voltage drop across one wire, from Ohm’s law, will be U = I × R.
- In the first case (100W = 5V × 20A), the voltage drop across one wire will be 20A × 0.01Ω = 0.2V. The power dissipated by this wire will be 0.2V × 20A = 4W, so the transformer will need to supply 100W for the bulb + 8W for losses on the two wires.
- In the second case (100W = 500V × 0.2A), the voltage drop across one wire will be 0.2A × 0.01Ω = 0.002V, or 2 mV. The power dissipated by this wire equals 0.002V × 0.2A = 0.0004W, or 0.4 mW. The transformer needs to supply 100W for the bulb + 0.8 mW for losses. The sketch below reruns both cases.
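Here is a rough sketch of the wire-loss arithmetic, assuming the same idealized 100W lamp, the 0.01Ω per wire from the text, and a two-wire circuit; the names are again only illustrative.

```python
# Voltage drop and dissipated power in the supply wires for both cases.
P_LAMP = 100.0   # W, lamp power
R_WIRE = 0.01    # ohm, resistance of a single wire
N_WIRES = 2      # current flows out through one wire and back through the other

for voltage in (5.0, 500.0):
    current = P_LAMP / voltage          # A, from I = P / U
    drop = current * R_WIRE             # V per wire, from U = I * R
    loss = drop * current * N_WIRES     # W total, from P = U * I summed over both wires
    print(f"{voltage:>5.0f} V: drop {drop:.3f} V per wire, loss {loss:.4f} W total")
```

It prints a 0.200 V drop and 8.0000 W of loss for the 5V case, and a 0.002 V drop and 0.0008 W (0.8 mW) of loss for the 500V case.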
What we can notice is that:
- We increased the voltage 100 times (from 5V to 500V)
- The current needed decreased 100 times (from 20A to 0.2A)
- The voltage drop across the wire decreased 100 times (from 0.2V to 0.002V)
- The power loss decreased 10000 (100 × 100) times, because wire loss is proportional to the square of the current (P = I² × R); the quick check below confirms this
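A quick numeric check of the “decreased 10000 times” claim, under the same assumptions as the sketches above:

```python
# Wire loss scales with the square of the current, so raising the voltage
# k times cuts the loss k**2 times.
P_LAMP, R_WIRE = 100.0, 0.01

def wire_loss(voltage):
    current = P_LAMP / voltage
    return current ** 2 * R_WIRE   # W per wire, from P = I^2 * R

k = 500.0 / 5.0                            # voltage increased 100 times
ratio = wire_loss(5.0) / wire_loss(500.0)
print(round(ratio), int(k) ** 2)           # both print 10000
```

The loss ratio equals the square of the voltage ratio because the current, and therefore the I²R loss, shrinks in proportion to the voltage increase.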