It's an interesting study, but it had three glaring errors. One, it assumed incompetent admins. Two, it was about hypothetical systems, not real, deployed ones (it was basically just counting exploits).
Three, well, I don't know; I'd have to see the data. It appears that they counted vulnerabilities disclosed and patched during a set period, but that wouldn't count the exploits already outstanding at the start of the study (and Windows has loads of those). Plus, Red Hat is known to fix critical exploits very quickly while leaving trivial ones on the back burner for a while, which would definitely pump up its unpatched-days number. Meanwhile, Microsoft has been known to leave critical exploits unpatched for months.
I hope that one of these days somebody will do an impartial real-world study. This one appears impartial, but it's definitely not real-world.
"I hope that one of these days somebody will do an impartial real-world study. This one appears impartial, but it's definitely not real-world."
A few years ago, I read several comparisons between Visual Basic and Delphi (my favorite programming language). Of all that I read (and I know VB has had some major upgrades since then), only one decided in favor of VB; it was the only one in which the listed benchmarks for VB surpassed Delphi's. There were two problems with it, though.
1. The testers were VB gurus who had never used Delphi before.
2. Microsoft had funded the study.
Nope, no bias there.