Posted on 01/12/2004 3:01:58 PM PST by John Jorsett
Edited on 04/13/2004 2:45:26 AM PDT by Jim Robinson.
Suppose that in the mid 1700s there was a group of blacksmiths and canal engineers who preached the gospel of the coming "Industrial Revolution". They would call themselves "Industrians", and spend their time promoting the idea to the masses, preparing themselves for the coming era of prosperity, and generally living an "industrial lifestyle". Would that have moved the industrial revolution forward in the slightest? Would that have had anything to do with anything? Not a chance.
I say the Singularity really is coming, and I welcome it. The best humanity can do is to rush headlong into it, and the best I can do is to push that along as far as possible, as fast as possible. And how do I do that? Oh, by doing my work and living my life.
And that includes eating Pringles and watching football. GO EAGLES!!
This is the kind of dork who'll trip over his own shoelaces and crack his head open on the radiator, to be found by the super 3 months later after the neighbors complain about the smell.
It is not strictly necessary, but I don't think most people realize how pathologically slow memory access actually is. For example, suppose we wanted to fully trace 1/10 of a second of network activation in the human brain (not that this has any direct relevance to AI, but it gives an idea of the scope of the problem). A modern CPU like an Opteron is easily within an order of magnitude of "real-time" for the raw computation, but the best off-the-shelf memory subsystem available now would take the better part of a day just to access the required memory on a data structure that large. This is a serious problem.
Add to this the fact that effective intelligence is roughly a function of structure size (i.e. "memory"), and you have a severe scaling issue. At some point, an intelligence becomes sufficiently slow that most responses will be irrelevant by the time they are returned, due to extremely high latency. Latency, in many different aspects, is the basic pragmatic limit on intelligence. From a theoretical standpoint, latency limits the effective Kolmogorov complexity of any time-bounded process and therefore the amount of intelligence it can express or bring to bear.
So in theory you are correct: speed has no relation to intelligence. As a pragmatic issue, though, intelligence isn't much good if it isn't timely.
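The latency argument above can be made concrete with a back-of-envelope calculation. All of the figures here (number of memory touches, DRAM latency, per-op CPU time) are illustrative assumptions chosen for the era the post describes, not measurements:

```python
# Back-of-envelope sketch of the compute-vs-latency gap described above.
# Every number here is an illustrative assumption, not a measurement.

ACCESSES = 1e12        # assumed random memory touches to trace 0.1 s of brain activity
DRAM_LATENCY_NS = 60   # rough random-access latency of commodity DRAM (~2004)
CPU_OP_NS = 0.5        # rough time per arithmetic op on a ~2 GHz CPU

# If accesses were compute-bound, the job is fast; if each one pays
# full random-access latency, the same job takes the better part of a day.
cpu_seconds = ACCESSES * CPU_OP_NS * 1e-9
mem_seconds = ACCESSES * DRAM_LATENCY_NS * 1e-9

print(f"compute-bound estimate: {cpu_seconds / 3600:.1f} hours")
print(f"latency-bound estimate: {mem_seconds / 3600:.1f} hours")
```

Under these assumptions the compute-bound estimate is minutes while the latency-bound one is on the order of seventeen hours, which is the shape of the problem being described: the CPU is nearly "real-time" in the abstract, but random access to a structure that large is not.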
I agree with a recent definition of AI: the point at which any given problem can be solved better by computers than by humans. This implies that the computer would not require specialized programming and would seek out the necessary information without human guidance.
This definition bypasses the need to discuss whether the machine is conscious or whether it could "pass" for human. I suspect this level of AI will be achieved long before we start seriously considering artificial consciousness.
I fully expect to live to see medical diagnosis and legal advice computerized. These would merely be extensions of currently available expert systems.
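The expert systems mentioned here are, at their core, forward-chaining rule engines. A minimal sketch of the idea follows; the rules and facts are made up for illustration and are not from any real diagnostic system:

```python
# Minimal forward-chaining inference, the core mechanism of a classic
# expert system. Rules and facts below are illustrative only.

rules = [
    ({"fever", "cough"}, "possible_flu"),
    ({"possible_flu", "short_of_breath"}, "recommend_exam"),
]

def infer(facts, rules):
    """Repeatedly fire any rule whose conditions hold until nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"fever", "cough", "short_of_breath"}, rules))
```

Real systems add certainty factors, explanation facilities, and far larger rule bases, but the loop above is the skeleton that medical-diagnosis and legal-advice systems of the era were built on.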
Programming itself will probably be automated in the next 20 years, with genetic algorithms and their successors creating the best code for any given application.
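For readers unfamiliar with genetic algorithms, here is a toy sketch of the technique: a population of candidates is scored, the fittest are kept, and the rest are bred by crossover and mutation. Evolving a bit string toward all ones stands in for "evolving code"; all parameters are arbitrary illustrative choices:

```python
import random

# Toy genetic algorithm: evolve a bit string toward all ones.
# Parameters are illustrative, not tuned.
random.seed(0)

TARGET_LEN, POP, GENS, MUT = 20, 30, 100, 0.05

def fitness(ind):
    return sum(ind)  # count of 1-bits; maximum is TARGET_LEN

def mutate(ind):
    return [b ^ (random.random() < MUT) for b in ind]  # flip each bit with prob MUT

def crossover(a, b):
    cut = random.randrange(1, TARGET_LEN)  # single-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]  # keep the fittest half unchanged (elitism)
    pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print(fitness(best), "of", TARGET_LEN)
```

Evolving real programs (genetic programming) replaces the bit string with a program tree and the fitness function with test results, but the select-breed-mutate loop is the same.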
Primitive humans, when confronted with people possessing more advanced technology, have always perceived them as gods.
Humans who become augmented as part of the "Singularity" will certainly become gods from our current point of view, because we have no understanding of their future capabilities and thought processes. As these "Singularity Humans" continue to evolve at an ever-quickening rate, they will become a lifeform as foreign to our comprehension as we are to an ant.
Will they ever become "God"? Only time will tell, but eventually there won't be anyplace else for them to evolve to.
Thanks for the info on latency. I had no idea. As far as what AI will really be? I am in the dark on this one.