This is something of an urban legend. No one knows what fraction of our brain's capacity we use; I have seen estimates of 70% or more, with the remaining 30% being redundant but still useful. The 10% figure is something somebody once threw out, and it has stuck as if it were fact. I'll let the rest of your post stand, though... I just like to stomp on these little "factoids".
How do you know that we use 10% of our brain capacity, especially since the mind cannot ultimately be reduced to matter?
It is impossible to know the limit of thought. But since the mind (an aspect of the spiritual soul) has the power to apprehend all things presented to it, it is in a sense all things, as Aristotle said.
This is an urban myth. All critters use all their neurons. Furthermore, for any fixed amount of hardware there is a trade-off between breadth of data and predictive accuracy. Different people use their hardware differently, and minor differences in hardware can make a big difference in practical capability. What this means is that everyone is always using all their hardware, and differences from person to person come down to both how much capacity they have AND how that capacity is allocated.
I would argue that to make a computer that knows all finite states would be impossible with our current understanding of computing technology. For one thing, there is the infinite regress problem: you would also have to know the finite states inherent in the computer itself, but since that knowledge is itself stored in states, you would have to know all of those states too, and so on.
It would seem so at first glance, but infinite recursion on any finite state machine is a finite state process. If it weren't, it wouldn't be expressible on an FSM. Still, for some finite state processes, even computers astronomically larger than what we use today would only be able to model them poorly. And if the universe is infinite, it would in fact not be possible to model all things in the universe on an FSM.