To: js1138; RadioAstronomer; tortoise
Can't speak for tortoise, but in previous encounters he has asserted that the theoretical problem has been solved and that the hardware problem is the difficult part.

My understanding is that the publicly known theoretical solutions require unreasonable amounts of computing resources, growing exponentially with problem complexity, so Moore's Law doesn't help. There are efforts underway to create good approximations that can run on today's hardware.

Another approach is to simulate the human brain directly; the hardware for that should be available within a few decades unless Moore's Law hits a wall. Estimates of the brain's computational power are around 10^16 operations per second, while Blue Gene does about 300 teraflops (3*10^14) today.
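
As a rough sketch of that gap, here is a back-of-the-envelope calculation in Python. The 10^16 and 3*10^14 figures are the estimates quoted above; the 18-to-24-month doubling period is the usual Moore's Law rule of thumb, not a measurement.

    import math

    # Back-of-the-envelope sketch only. Assumed figures: brain ~1e16 ops/sec
    # (biological-equivalence estimate), Blue Gene ~3e14 flops (2006), and a
    # Moore's Law doubling every 18-24 months.
    brain_ops_per_sec = 1e16
    blue_gene_flops = 3e14

    doublings_needed = math.log2(brain_ops_per_sec / blue_gene_flops)

    for months_per_doubling in (18, 24):
        years = doublings_needed * months_per_doubling / 12
        print(f"~{doublings_needed:.1f} doublings, roughly {years:.0f} years "
              f"at one doubling every {months_per_doubling} months")

Under those assumptions the raw throughput gap closes in roughly a decade; the "few decades" estimate above leaves room for Moore's Law slowing down and for the difference between peak flops and useful operations.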

7,005 posted on 08/21/2006 10:59:44 AM PDT by ThinkDifferent


To: ThinkDifferent; js1138; RadioAstronomer
Re: computational requirements for human-level intelligence

Simulating the human brain at any level of detail is computationally very expensive and still a couple of orders of magnitude beyond our biggest systems. Virtually all estimates like this are based on models of biological equivalence, not computational equivalence. There is a very good argument that an estimate of the computing power needed for biological equivalence is only as useful, with respect to intelligence, as the model it is based on.

Computational equivalence (the silicon actually required to implement the equivalent functions and structures) is being actively worked on by both the wetware neural-modeling researchers and the theoretical computer science folks, who are mostly converging on Bayesian computational models from somewhat different directions. In the last few years the average estimates of computational equivalence from both the wetware and the hardware researchers have decreased significantly, and from different research directions. Both the neural guys and the compsci guys now seem to estimate that computational equivalence is about two orders of magnitude below the biologically equivalent model.
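
To put rough numbers on that (illustrative only; the figures below are the estimates discussed in this thread, not published measurements), the two kinds of estimate compare to current hardware like this:

    # Illustrative comparison only; the figures are the rough estimates from
    # this discussion, not published measurements.
    biological_equivalence = 1e16                              # ops/sec
    computational_equivalence = biological_equivalence / 100   # ~two orders lower
    current_hardware = 3e14                                    # ops/sec, Blue Gene class (2006)

    for label, estimate in [("biological equivalence", biological_equivalence),
                            ("computational equivalence", computational_equivalence)]:
        print(f"{label}: ~{estimate:.0e} ops/sec "
              f"({estimate / current_hardware:.1f}x current hardware)")

On those rough numbers the biologically equivalent target is tens of times beyond current big iron, while the computational-equivalence target lands in the neighborhood of hardware that already exists.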

The wetware researchers' models are slowly converging with the compsci models, which is partly why their estimates roughly track each other. One of the major drivers of insane computational requirements has been the conservative need to recompute the state of the entire data structure at every time quantum. Current computational models on both sides of the fence have become much sparser and less pathological from a silicon standpoint. The models are still massively parallel (kind of like millions of very simple bucket-brigade stack-machine macro structures), but the algorithms allow large quantities of state to be ignored in any given time quantum, as the sketch below illustrates.
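
A minimal, purely illustrative Python sketch of that dense-versus-sparse distinction; the wiring, thresholds, and constants here are invented for the example and are not taken from either the wetware or the compsci models being described:

    import random

    # Toy model: N_NODES simple units with random fan-out wiring. All numbers
    # are made up for illustration.
    N_NODES = 100_000
    THRESHOLD = 1.0

    state = [0.0] * N_NODES
    downstream = [[random.randrange(N_NODES) for _ in range(4)]
                  for _ in range(N_NODES)]

    def dense_step(external_input):
        """Conservative approach: recompute every node's state each time quantum."""
        for i in range(N_NODES):                      # touches all N_NODES every tick
            state[i] = 0.9 * state[i] + external_input.get(i, 0.0)

    def sparse_step(events):
        """Event-driven approach: only nodes that received input are touched."""
        next_events = {}
        for i, delta in events.items():               # touches only the active set
            state[i] += delta
            if state[i] > THRESHOLD:                  # node "fires", resets, and
                state[i] = 0.0                        # propagates to its fan-out
                for j in downstream[i]:
                    next_events[j] = next_events.get(j, 0.0) + 0.3
        return next_events

    # Usage: inject a handful of events and step the sparse model; in any given
    # time quantum the vast majority of the state is simply never looked at.
    events = {random.randrange(N_NODES): 1.2 for _ in range(10)}
    for _ in range(5):
        events = sparse_step(events)
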

More and faster CPU/memory are always welcome, though, because even the mediocre approximations have a bad habit of unbounded resource growth under ideal conditions.

7,025 posted on 08/22/2006 4:35:25 PM PDT by tortoise
