If this means what I think it means -- that the brain could theoretically be halted like a computer and its memory dumped for analysis -- I think you're wrong.
Everything about the brain suggests a lot of analog processing going on. Not to mention that there has never been a demonstration that even the simplest information can be captured from a brain and decoded.
There has been a tremendous effort to replace damaged sensory inputs -- hearing, for example -- with computerized prosthetics. The results so far indicate two things: we don't know how sound is converted into usable nerve impulses, and the brain is so adaptable that it can learn, with time and effort, to use crappy inputs.
There is no intrinsic difference between analog and digital hardware; both are just information coding formats. In practice, modern digital encodings achieve more real-world resolution. The biggest problem with interfacing with the brain is that our man-made computational hardware format (mostly-serial clocked logic) is very different from the computational hardware format of the brain. That makes it a very difficult engineering problem, though not an impossible one. Some very significant improvements in silicon/neuron interfacing technology have been made in the last few years.
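To make the coding-format point concrete, here's a minimal sketch (mine, not anything from the posts above; the waveform and bit depths are made up for illustration). It quantizes a stand-in "analog" signal at different bit depths: resolution is a property of the encoding, and the measured SNR tracks the standard 6.02*bits + 1.76 dB rule of thumb, while real analog circuitry is typically limited by thermal noise to somewhere around 60-100 dB.

import numpy as np

def quantize(signal, bits):
    # Uniform mid-tread quantizer for a signal in [-1, 1].
    step = 2.0 / (2 ** bits)
    return np.round(signal / step) * step

t = np.linspace(0.0, 1.0, 10_000)
analog = np.sin(2 * np.pi * 5 * t)  # stand-in for an analog waveform

for bits in (8, 16, 24):
    noise = analog - quantize(analog, bits)
    snr_db = 10 * np.log10(np.mean(analog ** 2) / np.mean(noise ** 2))
    print(f"{bits:2d}-bit encoding: SNR ~ {snr_db:.1f} dB")

The point isn't that digital is magic; it's that "analog" and "digital" are both just ways of encoding the same information, with a resolution you can dial up or down.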
All the information could, in theory, be captured and decoded. The problem is that we aren't very good at working with that type of hardware, because it is so different from what we are used to working with. The problems are slowly being worked out, and eventually this will become a reality, probably sooner rather than later.
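The "in theory" part is just the Shannon-Nyquist sampling theorem: a band-limited analog signal is completely determined by discrete samples and can be rebuilt from them. Another illustrative sketch of mine (the 100 Hz rate and 17 Hz tone are arbitrary choices, not anything measured from neurons):

import numpy as np

def sinc_reconstruct(samples, fs, t):
    # Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc(fs*t - n).
    n = np.arange(len(samples))
    return np.sum(samples[None, :] * np.sinc(fs * t[:, None] - n), axis=1)

fs = 100.0                             # sample rate, Hz
f = 17.0                               # tone frequency, safely below fs / 2
ts = np.arange(0.0, 1.0, 1.0 / fs)     # sampling instants
samples = np.sin(2 * np.pi * f * ts)

t_fine = np.linspace(0.1, 0.9, 2_000)  # evaluate away from the window edges
error = sinc_reconstruct(samples, fs, t_fine) - np.sin(2 * np.pi * f * t_fine)
print("max reconstruction error:", np.max(np.abs(error)))

The residual error comes only from truncating the sample window; nothing about a signal being analog puts it beyond digital capture in principle. The hard part, as you say, is the interfacing, not the information theory.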