I saw a post like this fly by on my Facebook feed today. I lost it when I closed the window, but the question lingered: why is a transformer rated in kVA, not in kW? I did not know the answer, but I felt I should, because I have also seen kVA ratings used in datacenter contracts rather than kW.
Many electricians point out that kVA represents apparent power, while kW is the real power, the amount of useful work available under load. The two are related by the power factor, so kW = kVA × pf, and in a perfect system kVA = kW. Some explanations blame transmission loss for the power factor being less than 1, but it really reflects the current getting out of phase with the voltage when the load is reactive.
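To make the kW = kVA × pf relationship concrete, here is a minimal sketch in Python. The voltage, current, and phase angle are made-up numbers for illustration, not from any real transformer datasheet; for a purely sinusoidal load, the power factor is just the cosine of the phase angle between voltage and current.

```python
import math

# Made-up example values for a sinusoidal load (not real hardware specs).
v_rms = 480.0          # RMS voltage in volts
i_rms = 100.0          # RMS current in amps
phase_deg = 30.0       # phase angle between voltage and current, in degrees

apparent_power_kva = v_rms * i_rms / 1000.0        # kVA = V_rms * I_rms
power_factor = math.cos(math.radians(phase_deg))   # pf = cos(phase angle)
real_power_kw = apparent_power_kva * power_factor  # kW = kVA * pf

print(f"Apparent power: {apparent_power_kva:.1f} kVA")
print(f"Power factor:   {power_factor:.3f}")
print(f"Real power:     {real_power_kw:.1f} kW")
```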
But someone also pointed out that for DC (direct current), kVA = kW because the current does not go out of phase (not a great explanation, though). That reminded me of college physics: AC (alternating current) is a sine wave, and its RMS (root mean square) voltage is smaller than its peak voltage. For a sine wave, V_rms = V_p / √2 = (√2 / 2) × V_p ≈ 0.707 × V_p. The RMS voltage, not the peak voltage, is what goes into calculating work.
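Here is a quick numerical check of that V_rms = V_p / √2 relationship, again just a sketch: the 170 V peak is roughly what a 120 V RMS household outlet swings to, and the sample count is arbitrary.

```python
import math

# Sample one full period of a sine wave and compute its RMS numerically.
v_peak = 170.0
samples = 10_000
values = [v_peak * math.sin(2 * math.pi * k / samples) for k in range(samples)]

v_rms_numeric = math.sqrt(sum(v * v for v in values) / samples)
v_rms_formula = v_peak / math.sqrt(2)

print(f"Numerical RMS: {v_rms_numeric:.2f} V")  # ~120.21 V
print(f"Vp / sqrt(2):  {v_rms_formula:.2f} V")  # ~120.21 V
```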
As a side note, AC in the real world only approximates a sine wave, and the modified sine wave output of a UPS can look much worse.
So the answer is this: a transformer is rated in kVA because it has to handle the full voltage and current flowing through it, regardless of how much of that becomes useful work at the load. The usable work in kW depends on the load's power factor, which the transformer maker cannot know in advance, so a kW rating would under-specify the design requirements of a transformer.
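To see why a kW rating alone would under-specify the hardware, here is a rough single-phase sketch with made-up numbers: the same 100 kW of useful work draws more and more current as the power factor drops, and the windings have to carry that current either way.

```python
# Same real power, different power factors: the current (and kVA) grows
# as the power factor drops, so kVA is what the transformer must be built for.
v_rms = 480.0            # assumed secondary voltage in volts
real_power_kw = 100.0    # real power delivered to the load

for power_factor in (1.0, 0.8, 0.6):
    apparent_power_kva = real_power_kw / power_factor
    current_a = apparent_power_kva * 1000.0 / v_rms
    print(f"pf={power_factor:.1f}: {apparent_power_kva:6.1f} kVA, "
          f"{current_a:6.1f} A through the windings")
```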
Here is an afterthought: when physics gets applied in the real world, electricians learn a simplified version of it, stated as a matter of fact. That version is not wrong, but it skips the reasoning behind it. Some people are perfectly happy to learn just enough to get stuff done. But if you want to keep innovating, you have to understand the reasoning behind an existing system; otherwise you reinvent the wheel poorly, missing the lessons of past mistakes and making new ones that the people before you knew to avoid.
On that topic, I recommend watching this video by SmarterEveryDay - I Was SCARED To Say This To NASA... (But I said it anyway).