Right, but the point is this: suppose you have chosen a motor for an application (and there are an almost infinite number of different applications) that runs at 1770 RPM because that is exactly what the application requires, and everything downstream is designed around that speed. With this "experiment," that speed could apparently vary at any given moment. It seems to me it's more complicated than a bunch of messed-up clocks.
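To make the dependence concrete, here is a small sketch (my own illustration, not from the thread) assuming the 1770 RPM figure refers to a 4-pole induction motor on a 60 Hz grid, which puts its synchronous speed at 1800 RPM with roughly 1.7% slip. Shaft speed scales directly with line frequency:

```python
def synchronous_rpm(freq_hz: float, poles: int) -> float:
    """Synchronous speed in RPM: 120 * f / poles."""
    return 120.0 * freq_hz / poles

def shaft_rpm(freq_hz: float, poles: int, slip: float) -> float:
    """Approximate shaft speed, assuming slip stays roughly constant."""
    return synchronous_rpm(freq_hz, poles) * (1.0 - slip)

# A 4-pole motor at 60 Hz: 1800 RPM synchronous, ~1770 RPM at 1.67% slip.
nominal = shaft_rpm(60.0, 4, 0.0167)
# The same motor if the grid sags to 59.9 Hz:
off_freq = shaft_rpm(59.9, 4, 0.0167)
print(round(nominal), round(off_freq))  # a 0.1 Hz sag costs about 3 RPM
```

So even a small frequency shift moves every frequency-locked shaft on the grid by the same proportion.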
I think it can get a lot more spectacular than messed-up clocks. If the entire interconnect is not kept within a few percent of synchronization, you could end up with destroyed generators at power plants. This is a common problem with home generators: if one is not equipped with a proper disconnect, grid power comes back on and the generator is destroyed because it was not synchronized with the grid. I doubt there are many large (megawatt-class) gensets sitting in warehouses waiting to be installed, and even the ones that are would take days to install.
I think it would depend on how fast the frequency could fluctuate and how far. The physics of a large ensemble of generators feeding the grid place considerable physical constraints on these two quantities.
If the claimed limit of 14 seconds per day is a reliable indication, that works out to 14 parts in 86,400, or about 162 parts per million. Unless your motor-driven industrial system is doing audio or video work, that amount of deviation is probably trivial.
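The arithmetic checks out, assuming "14 seconds per day" means accumulated clock error over one day of 86,400 seconds:

```python
# Accumulated timing error expressed as a fractional deviation.
seconds_per_day = 24 * 60 * 60      # 86,400 s
error_s = 14.0                       # claimed worst-case daily clock error
ppm = error_s / seconds_per_day * 1e6
print(round(ppm))                    # ~162 ppm, matching the figure above

# At a 60 Hz nominal frequency, 162 ppm corresponds to a shift of
# only about 0.01 Hz:
print(round(60.0 * ppm / 1e6, 4))
```

Which is why the effect shows up as clock drift over days rather than as anything a typical motor load would notice.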
This brings to mind the time in 1965 when I was watching a network TV show originating, on film, from New York City. Suddenly the image went dim, slowed down noticeably (as was evident from the sound track), and then went dark. A few seconds later the local station put up a network trouble slide. This was the great 1965 Northeast blackout. The projector was equipped, as most are, with a synchronous motor, so for a second or two I was actually watching the frequency of the entire Northeast grid sag well below 60 Hz before it went away completely.