There is no intrinsic difference between analog and digital hardware. Analog and digital are both information coding formats; in practice, modern digital offers more real-world resolution. The biggest problem with interfacing with the brain is that our man-made computational hardware format (mostly serial clocked logic) is very different from the computational hardware format of the brain. This makes it a very difficult engineering problem, though not an impossible one. Some very significant improvements in silicon/neuron interfacing technologies have been made in the last few years.
All the information could definitely be captured and decoded in theory. The problem is that we aren't very good at working with that type of hardware, because it is so different from what we are used to. The problems are slowly being worked out, and eventually it will become a reality, probably sooner rather than later.
Huh? I thought digital was meaningless unless there is an agreement between sender and receiver? Analog gives information regardless of a receiver's understanding, and so is not a "coding format." Did I miss something?
There is a huge difference between analog and digital computing. Analog computing is "instantaneous", limited only by the bandwidth of the components taken as a whole, with no serial processing time. This is the only kind of computing that could match the performance of the brain when the components are limited to about 100 Hz.
If you think of the brain as having tens of billions of A/D converters, each with hundreds of simultaneous inputs, it'll take a while to understand and mimic this complexity.
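To make the scale argument in the last two posts concrete, here is a rough back-of-envelope sketch. All the figures (ten billion converter-like units, hundreds of inputs each, roughly 100 Hz update rate, a 3 GHz serial clock) are illustrative assumptions drawn from the posts above, not measured values:

```python
# Back-of-envelope comparison of massively parallel ~100 Hz components
# versus a single fast serial processor. All numbers are rough
# assumptions for illustration only.

units = 10e9           # "tens of billions" of A/D-converter-like units
inputs_per_unit = 200  # "hundreds of simultaneous inputs" per unit
update_rate_hz = 100   # each unit updates at roughly 100 Hz

# Aggregate input-evaluations per second, done fully in parallel:
parallel_ops = units * inputs_per_unit * update_rate_hz

# A hypothetical serial processor evaluating one input per clock cycle:
cpu_clock_hz = 3e9
serial_ops = cpu_clock_hz

print(f"parallel, slow components: {parallel_ops:.1e} evaluations/s")
print(f"serial, fast clock:        {serial_ops:.1e} evaluations/s")
print(f"ratio: {parallel_ops / serial_ops:.0f}x")
```

Under these assumptions the slow parallel system comes out tens of thousands of times ahead of the fast serial one, which is the point being made: the gap is in the architecture, not the component speed.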