I think it would depend on how fast the frequency could fluctuate and how far it could deviate. The physics of a large ensemble of generators feeding the grid places considerable constraints on both quantities.
If the claimed limit of 14 seconds per day is a reliable indication, that works out to 162 parts per million. Unless your motor-driven industrial system is doing audio or video work, that amount of deviation is probably trivial.
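The arithmetic above is easy to verify. A minimal sketch of the conversion from accumulated clock error to parts per million (the function name is my own; 86,400 is just the number of seconds in a day):

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def drift_ppm(seconds_off_per_day: float) -> float:
    """Convert accumulated timekeeping error per day to parts per million."""
    return seconds_off_per_day / SECONDS_PER_DAY * 1e6

# 14 s/day works out to roughly 162 ppm:
print(round(drift_ppm(14)))  # → 162
```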
This brings to mind the time I was watching a TV network show originating, on film, from New York City. Suddenly the image went dim, slowed down noticeably (as was evident from the sound track), and then went dark. A few seconds later the local station put up a network-trouble slide. This was the great Northeast blackout of November 1965. The projector, like most, was driven by a synchronous motor, so for a second or two I was actually watching the frequency of the entire northeast grid sag well below 60 Hz before going away completely.
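The reason the projector tracked the grid so faithfully is the standard synchronous-speed relation, N = 120·f/P rpm: a synchronous motor's shaft speed is locked to the line frequency. A minimal sketch, with an assumed 4-pole motor and an illustrative sagging frequency (neither figure comes from the anecdote):

```python
def synchronous_rpm(freq_hz: float, poles: int) -> float:
    """Shaft speed of a synchronous motor: N = 120 * f / poles, in rpm."""
    return 120.0 * freq_hz / poles

# Hypothetical 4-pole motor: shaft speed falls in lockstep with the grid.
print(synchronous_rpm(60.0, 4))  # 1800.0 rpm at nominal 60 Hz
print(synchronous_rpm(55.0, 4))  # 1650.0 rpm as the frequency sags
```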
That's an interesting story about the blackout.