That 30-35% factor mostly reflects how much the wind blows (or doesn't), right? I mean, sure, if you put a meter on the turbine, it might produce 35% of its rated capacity over a given year.
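For what it's worth, the arithmetic behind a figure like 35% is just energy delivered divided by what the nameplate rating would produce if it ran flat out. Here's a quick sketch in Python; the 100 MW farm and its annual output are made-up numbers, purely to illustrate:

```python
def capacity_factor(energy_mwh: float, rated_mw: float, hours: float) -> float:
    """Actual energy delivered divided by rated power times hours in the period."""
    return energy_mwh / (rated_mw * hours)

# Hypothetical 100 MW wind farm delivering 306,600 MWh over a year (8,760 hours):
cf = capacity_factor(306_600, 100, 8_760)
print(f"{cf:.0%}")  # prints "35%"
```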
So it'd be interesting to see how they came up with 8.7%. Forgive my ignorance, but I could only guess it has something to do with further reducing your figure by accounting for how much of that energy is actually consumed versus how much is 'wasted', that is, how easy (or difficult) it is to make use of the energy coming from the turbines at the moments the wind is blowing. I suppose if the rest of your generating capacity comes from gas-fired plants, which can ramp up and down quickly, it's theoretically possible to make the best use of your wind farms. (Again, forgive me if I'm out in left field.)
Of course, some amount of the energy produced by conventional power plants is wasted, too. I wonder what the capacity factor would be for a typical nuke.
I'm guessing that particular operator is penalizing wind farms for not producing that 30-35 percent at all times. We've seen days in Iowa where a wind farm produces over 75 percent of rated capacity all day and all night, but in the summer there are days when they produce very little.