The actual annual capacity factor is more like 30-35 percent, regardless of what a grid operator wants to claim.
That 30-35% factor mostly reflects how much the wind blows (or doesn't), right? I mean, sure, if you put a meter on the turbine, it might produce 35% of its rated capacity in a given year.
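Just to spell out the arithmetic, here's a quick Python sketch of what capacity factor means (the 2 MW rating and the metered output are made-up numbers purely for illustration):

```python
# Capacity factor = actual energy delivered / energy if running at rated
# output all year. Hypothetical numbers: a 2 MW turbine metered for a year.
rated_mw = 2.0
hours_per_year = 8760
actual_mwh = 6100  # made-up metered annual output

capacity_factor = actual_mwh / (rated_mw * hours_per_year)
print(f"capacity factor: {capacity_factor:.1%}")  # ~34.8%
```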
So it'd be interesting to see how they came up with 8.7%. Forgive my ignorance, but my only guess is that it comes from further reducing your figure to account for how much of that energy is actually consumed versus how much is 'wasted', i.e., how easy (or difficult) it is to make use of the energy coming from the turbines when the wind happens to be blowing. I suppose if the rest of your generating capacity comes from gas-fired plants, which can ramp up and down quickly, it's theoretically possible to make the best use of your wind farms. (Again, forgive me if I'm out in left field.)
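If that guess is anywhere near right, the implied 'usable' share falls straight out of division (pure speculation on my part, just combining the figures quoted in this thread):

```python
# If 8.7% is the raw capacity factor discounted by how much of the wind
# output the grid can actually absorb, the implied usable share is just
# the ratio of the two figures.
capacity_factor = 0.35  # upper end of the range quoted above
effective = 0.087       # the 8.7% figure being questioned

implied_usable_share = effective / capacity_factor
print(f"implied usable share: {implied_usable_share:.0%}")  # ~25%
```

That would mean only about a quarter of the wind output is being counted as usable, which seems like the kind of assumption worth asking them about.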
Of course, some of the energy produced by conventional power plants is wasted, too. I wonder what the capacity factor is for a typical nuke.
As someone with several years of grid operation under my belt, including the use of wind generation, I know better. 20% is a realistic figure over the course of a year, and that's from actual data, not speculation. It's less in areas where wind isn't as suitable, more where it is.