Bingo! Here’s the NERC BAL-003-1 Frequency Response Field Test document:
http://www.nerc.com/docs/standards/sar/2007-12_BAL-003-1_FR_Field_Test_Document_20110204.pdf
Here’s the operative quote:
“BAL-003-1 proposes to bring Frequency Bias Settings closer to Frequency Response. The drafting team proposes to reduce the minimum Frequency Bias Settings over a period of years. The drafting team proposes to establish a new minimum Frequency Bias Setting in 2011 (-0.8% of peak/0.1Hz, compared to the present -1% of peak/0.1Hz). The drafting team, NERC and the Resources Subcommittee will observe the impact on frequency and will implement a reversion plan as necessary.”
Looks to me like they are proposing tighter, not looser, frequency control, and from reading the other docs, mainly between interconnections.
Another doc that outlines the proposed standards:
http://www.nerc.com/docs/standards/sar/2007-12_BAL-003-1_Implementation_Plan_20110204.pdf
Again, the operative numbers:
May 2011 through December 2011 -0.8% of peak/0.1 Hz
January 2012 through December 2012 -0.6% of peak/0.1 Hz
January 2013 through December 2013 -0.4% of peak/0.1 Hz
January 2014 through December 2014 -0.2% of peak/0.1 Hz
January 2015 onward -0.0% of peak/0.1 Hz
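To get a feel for what those percentages mean in practice, here’s a minimal sketch converting the per-year minimums into MW per 0.1 Hz. The peak demand figure is a made-up round number, not from any of the linked documents:

```python
# Convert the proposed minimum Frequency Bias Settings
# (percent of annual peak demand per 0.1 Hz) into MW/0.1 Hz.

PEAK_MW = 50_000  # hypothetical annual peak demand in MW (not from the docs)

# Proposed minimums from the implementation plan (% of peak per 0.1 Hz)
schedule = {
    2011: -0.8,
    2012: -0.6,
    2013: -0.4,
    2014: -0.2,
    2015: -0.0,
}

for year, pct in schedule.items():
    bias_mw = pct / 100 * PEAK_MW  # minimum bias in MW per 0.1 Hz
    print(f"{year}: minimum Frequency Bias Setting = {bias_mw:.0f} MW/0.1 Hz")
```

So for a 50,000 MW peak, the 2011 minimum of -0.8% works out to -400 MW/0.1 Hz, stepping down to zero by 2015, versus -500 MW/0.1 Hz under the present -1% guideline.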
The Attachment A doc explains that calculation and implies the current guideline is -1% of peak/0.1 Hz:
http://www.nerc.com/docs/standards/sar/2007-12_BAL-003-1_Attachment_A_20110204.pdf
I’ve seen enough to conclude the AP and linked article authors don’t know what they are talking about.
They are, though that's a somewhat different issue from the experimental elimination of time error correction. And they should; I watch the frequency as part of my job, and I would agree that tightening the frequency response requirements might be a good idea.