Yeah, I totally agree. Actually the entire thread is very entertaining if you like watching software bake. Woodhouse commented on Hacker News that he wasn't offended at all, and later in the thread he rebutted some of the objections by pointing out that Linus was mistaken about the context. He also doesn't work for Intel any more, though he obviously still has a close relationship with them.
But I have to say it's very disturbing to think that a microcode change could also silently invalidate the retpoline approach. Now I'm having paranoid fantasies of secret NSA microcode updates that render systems open kimono, and we can't even tell the difference!
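For anyone who hasn't looked at what a retpoline actually is, here's a minimal sketch of the thunk pattern in C with top-level GCC-style asm (x86-64, AT&T syntax). The symbol name is my own stand-in, not the kernel's exact code, though it follows the same call/ret trick as the kernel's __x86_indirect_thunk_* helpers:

    /* Replace an indirect "jmp *%r11" with a call/ret pair.  The
       return predictor speculates the ret back to the instruction
       after the call, where it gets trapped in a harmless loop,
       while the architectural ret goes to the real target. */
    void retpoline_thunk_r11(void);

    __asm__(
        ".text\n"
        ".globl retpoline_thunk_r11\n"
        "retpoline_thunk_r11:\n"
        "    call 1f\n"          /* push address of 0: and jump to 1: */
        "0:  pause\n"            /* speculative path spins here */
        "    lfence\n"
        "    jmp 0b\n"
        "1:  mov %r11, (%rsp)\n" /* overwrite return address with target */
        "    ret\n"              /* 'return' into the real target */
    );

The whole trick rests on assumptions about how the CPU predicts that final ret, which is exactly why a microcode update that changes predictor behavior could quietly change whether the sequence still protects you.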
There has also been criticism implying this is an emergent property of bashing the x86 architecture into the VM server era. However, I recall similar bugs from the mainframe and minicomputer eras, e.g. amusing race conditions that could allow executing code to exceed the scope of a virtual environment. So even purpose-built VM hardware from past generations had similar problems.
Unfortunately many people think that computing is very deterministic and Newtonian. But in fact it's built on quantum mechanics and full of Gödelian gotchas. Thus I predict there will always be exploits, and since our technological civilization is completely dependent on computing, it will continue to be worth the trouble of sniffing these things out, no matter how obscure they may seem.
I'd have to disagree with you here to a certain degree. We're still dealing with a deterministic environment. The big problem we're running into is that the entire environment has become so complex that it's difficult to predict, especially when you're dealing with things like this that are extremely time-sensitive. From what I've read about the way Intel (and apparently others) implemented speculative execution, it sounds like they made some expedient design choices that bought them a little speed without recognizing the potential for those choices to be subverted. Indeed, when those decisions were made, I don't think the attacks that have since been realized were even theoretical. (though perhaps they should have been)
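To make that concrete, here's a minimal C sketch of the bounds-check-bypass pattern (Spectre variant 1) as described in the published papers; the array names, sizes, and the 512-byte stride are illustrative, not any particular real-world gadget:

    #include <stddef.h>
    #include <stdint.h>

    uint8_t array1[16];
    size_t  array1_size = 16;
    uint8_t array2[256 * 512];    /* probe array: one slot per byte value */
    volatile uint8_t sink;        /* keeps the load from being optimized out */

    /* If the branch predictor has been trained to expect the bounds
       check to pass, the CPU may speculatively load array1[x] for an
       out-of-bounds x and use that secret byte to index array2.  The
       architectural results are rolled back, but the touched cache
       line stays warm, so timing later loads of array2 recovers the byte. */
    void victim(size_t x)
    {
        if (x < array1_size)
            sink = array2[array1[x] * 512];
    }

Every step in there is perfectly deterministic; the problem is that the interesting behavior happens below the level at which the architecture promises anything, and is only observable through timing.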
For myself, as a longtime professional nerd who has really enjoyed watching our advances from the 8088 chips to the monsters we use now, I'm actually astounded that we don't find more errors in these chips, given their current levels of complexity. If Microsoft could write software half as well, MS-Windows wouldn't be the virus/trojan feeding frenzy that it has become.
Speaking of software quality, I think the Linux kernel has proven its quality fairly well over time. Granted, BSD has a better track record (IMO), but compared to some others out there, Linux stacks up fairly well, so it would seem wise to consider Linus' point of view on this. Though he's not the most people-person type of guy out there, he has been willing to own up to his own errors in the past, and given the aforementioned quality of the product he shepherds, I'm willing to give him the benefit of the doubt most of the time.