I understand that the machine language instruction sets, the methods of accessing memory, etc. are different. But I was curious whether there was actually something about x86 that made it more susceptible to exploits. The only thing I can think of is that the vast majority of buffer-overflow exploits have been geared towards the x86 instruction set, partly because Windows allows for these exploits and partly because x86 is simply so prevalent. So, even though you wouldn't be able to rely on the various exploitable DLLs in OS X, you might still have a bunch of machine language code that had already been tested on the hardware.
Inherently? No. There's nothing about x86 that leaves a system more vulnerable, contrary to the premise of that article.