I’m thinking their idea is that they want to boost the line voltage to compensate for the I²R losses caused by peak demand. The only way they can do that without changing the infrastructure is to run their alternators at a higher speed, which ups the line frequency. This doesn’t affect most electronic devices because they use various DC power supplies, along with internal timebases.
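The I²R part of that reasoning is just arithmetic: for a fixed delivered power, raising the line voltage lowers the current, and the line loss falls with the square of the current. A rough sketch in Python (the 10 MW load, 5-ohm line, and the voltage values are made-up numbers, for illustration only):

```python
# Rough illustration of why higher line voltage means lower I^2*R loss
# for the same delivered power. All numbers are made up for the example.

def line_loss(power_w, line_voltage_v, line_resistance_ohm):
    """Resistive loss in the line for a given delivered power and voltage."""
    current = power_w / line_voltage_v          # I = P / V (unity power factor assumed)
    return current ** 2 * line_resistance_ohm   # P_loss = I^2 * R

P = 10e6   # 10 MW of delivered power
R = 5.0    # 5 ohms of line resistance

for v in (115e3, 120e3, 138e3):
    print(f"{v/1e3:6.0f} kV -> loss = {line_loss(P, v, R)/1e3:6.1f} kW")
```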
So the errors would always be in the same direction (clocks run fast) and would be cumulative?
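If the frequency really did run high, yes: a synchronous clock just counts line cycles and assumes exactly 60 per second, so a steady frequency offset integrates directly into time error, always in the same direction. A back-of-the-envelope sketch (the 60.05 Hz offset is an arbitrary example, not a real grid figure):

```python
# A line-synchronous clock counts cycles and assumes exactly 60 Hz,
# so a steady frequency offset accumulates as a time error.
# The 60.05 Hz value below is an arbitrary example, not a real grid figure.

NOMINAL_HZ = 60.0
actual_hz = 60.05

seconds_per_day = 24 * 3600
cycles_per_day = actual_hz * seconds_per_day          # cycles the clock actually counts
indicated_seconds = cycles_per_day / NOMINAL_HZ       # time the clock displays
error_seconds = indicated_seconds - seconds_per_day   # positive => clock runs fast

print(f"Clock gains {error_seconds:.0f} seconds per day "
      f"({error_seconds * 7 / 60:.1f} minutes per week)")
```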
***I’m thinking their idea is that they want to boost the line voltage to compensate for the I²R losses caused by peak demand. The only way they can do that without changing the infrastructure is to run their alternators at a higher speed, which ups the line frequency.***
Can’t do it on a generator. They are designed to run at only 60 Hz. To boost voltage you increase the field excitation (the magnetism in the generator rotor), and the frequency never varies.
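The frequency is locked to the mechanical speed by the pole count (f = poles × RPM / 120), which is why you can’t boost voltage by spinning the machine faster without also shifting the frequency; voltage comes from the field excitation instead. A small sketch of that relationship (the pole counts and speeds are typical textbook values, not data for any real unit):

```python
# Synchronous machine: electrical frequency is fixed by rotor speed and pole count,
#   f = poles * rpm / 120
# Terminal voltage is adjusted separately, via the rotor field excitation.
# Pole counts and speeds below are typical textbook values, not real-unit data.

def synchronous_frequency(poles: int, rpm: float) -> float:
    return poles * rpm / 120.0

for poles, rpm in [(2, 3600), (4, 1800), (24, 300)]:
    print(f"{poles:2d}-pole machine at {rpm:4.0f} RPM -> "
          f"{synchronous_frequency(poles, rpm):.0f} Hz")
```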
In summer you have lots of VARs (volt-amperes reactive) due to the increase in induction motors on the line. To reduce the VAR load the dispatcher can switch capacitor banks onto the line. When he runs out of capacitors, the VAR load will keep increasing no matter what you do, even to the point of possibly melting down the three-phase power lines.
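The capacitor trick is just power-factor correction: the motors draw lagging VARs, and the capacitor banks supply them locally so the lines don’t have to carry them. A rough sketch of the arithmetic (the 10 MW load at 0.80 power factor and the 0.95 target are made-up example numbers):

```python
# Power-factor correction arithmetic: induction motors draw real power P plus
# lagging reactive power Q; capacitor banks supply Q locally so the line doesn't.
# The 10 MW / 0.80 PF load and the 0.95 target are made-up example numbers.
import math

P_w = 10e6          # real power drawn by the motor load
pf_actual = 0.80    # lagging power factor of the motors
pf_target = 0.95    # power factor the dispatcher wants to see

Q_actual = P_w * math.tan(math.acos(pf_actual))   # VARs the load actually draws
Q_target = P_w * math.tan(math.acos(pf_target))   # VARs allowed at the target PF
Q_caps   = Q_actual - Q_target                    # VARs the capacitor bank must supply

print(f"Load draws  {Q_actual/1e6:5.2f} MVAR")
print(f"Caps supply {Q_caps/1e6:5.2f} MVAR to reach {pf_target} PF")
```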
Almost the entire load base will increase its power consumption if the line voltage increases. The only exceptions are (1) those loads with automatic voltage regulators based on magnetic principles, and (2) those loads powered by switching power supplies. Interestingly, in these two classes of loads, the current consumption goes down with increasing voltage, holding the power consumption nearly constant. The power companies would love it if all their loads were this way, but that will never be.
The vast majority of the load base would respond to increased line voltage with increased power draw and decreased life.
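The difference between the two classes of load is easy to see by comparing a plain resistive load (P = V²/R, so power rises with the square of the voltage) against a constant-power switching supply (I = P/V, so current falls as voltage rises). A quick sketch with made-up numbers:

```python
# Compare a simple resistive load (P = V^2 / R) with a regulated constant-power
# load (I = P / V) as line voltage rises. All numbers are illustrative only.

R_heater = 12.0      # ohms; roughly a 1.2 kW heating element at 120 V (made-up value)
P_supply = 100.0     # watts drawn by a regulated switching power supply

for v in (114.0, 120.0, 126.0):
    p_heater = v ** 2 / R_heater        # resistive load: power grows with V^2
    i_supply = P_supply / v             # constant-power load: current shrinks as V grows
    print(f"{v:5.1f} V: heater {p_heater:6.1f} W, "
          f"switcher {P_supply:.0f} W at {i_supply * 1000:.0f} mA")
```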
***The only way they can do that without changing the infrastructure is to run their alternators at a higher speed, which ups the line frequency. This doesn’t affect most electronic devices because they use various DC power supplies, along with internal timebases.***
Upping the line frequency does not have any direct effect on either the line voltage or the power consumption of the loads on that line, as long as it is kept within limits, say (for North America) 57 to 63 Hz.
It is true that with the advent of computerized everything, fewer and fewer line-powered devices depend on an exact frequency or voltage these days. Electronic appliances (or their power supplies) often work from 100 to 240 VAC and from 47 to 63 Hz, so if they have any need for a time base, they get it from an internal crystal oscillator, or a WWV or GPS receiver.
But I should emphasize that TVs, computers, and smartphones are still a small fraction of the entire demand. Think motors and furnaces in steel mills, and HVAC loads everywhere.