Posted on 12/11/2002 6:28:08 AM PST by A2J
Judge not, that ye be not judged. For with what judgment ye judge, ye shall be judged: and with what measure ye mete, it shall be measured to you again. - Matthew 7:1-2

I don't come to the same conclusion you do. I see that passage as an attack on double standards. I'm quite prepared to judge people, and to be judged by the very same standards that I use in judging others.
I am the truth... (John 14:6)
"For most of time, men and women have assumed that the truth was there to be found out."
"This began to change in the 19th century and the change gathered pace in the 20th. As people began to question the existence of God, it became obvious to some, that the existence of truth requires the existence of God. With God dismissed, it became impossible to conceive of truth in any absolute sense. This has resulted in the humiliation of truth. Truth is now whatever you would like it to be."
"Truth's demise has filtered down through the great centers of learning, the arts, and on into streets and homes. Everything is possible with truth gone. Everything is permissible. Musicians make music that doesn't sound musical. Painters paint pictures that are incomprehensible to normal folk. Playwrights write plays that are nonsense, and architects design buildings that no one can understand."
"All this is put forward as legitimate, but what does it all mean? No matter how much... popular culture... is encouraged to believe in the relativity of truth, no one can build a decent life on such a notion."
"Inevitably proponents of freedom from God, and from absolute truth, are obliged to reach outside of their own system, and borrow something from theism in order to make their lives work."
"The person who believes that everything is valid, will soon find that he is condemned to meaninglessness."
"Christ is a standing offer of escape from such a hell as this. To believe that truth is like Christ is salvation indeed."
I know that entire libraries have been written on this subject, and as a layman who hasn't devoted years to the study of scriptural doctrines, I'm quite unqualified to debate with learned theologians. But as I see it, the object is to live one's life so well that judgment is not to be feared, and mercy isn't required. (But if I'm found wanting, I'll be grateful for all the help I can get.) Anyway, I'm judgmental; but I don't exempt myself from my judgments.
Aye, it is really the complexity of our language abilities that sets us apart, with most of the complexity relating to our ability to construct a relatively large number of short-term abstractions at once. This is perfectly consistent with the best theoretical models of how this ability scales with the "brain" size (in the abstract -- it doesn't really HAVE to be a brain). Some animal studies also seem to bear this out.
What I'm wondering is whether the advanced language skill of humans is one of the considerations in A.I. work-in-progress? If so, is that treated as a symbolization of acquired information in learning? Are there any rules for it?
Oy, very complex topic. However, general language abilities are coming along nicely. It takes very large computers to develop comparable capabilities at the same level of quality. Using these models, your average workstation still lacks the necessary RAM by a few orders of magnitude. Memory bandwidth and latency are ultimately the bottlenecks for computers competing with the human brain at tasks requiring relatively deep abstraction. Unfortunately, these don't improve as fast as core speed.
I want that on a bumper sticker.
Or like "laserness" or like "bevatronness" or like "2008 Corvetteness" or like "sonataness" or like "cubismness"? Where did bevatronness exist during the 18th century?
Dreams of fire hydrants and slow mailmen, one might suspect.
;-)
Well sure, that's what is in my head. What about the dog?
Not sure if you are speaking of abstraction in general or the handwriting question in particular. I know that in handwriting (and text) recognition, the only way they have been able to make it more accurate is by using a dictionary to help the program 'guess' the letter. The problem with this is that it is not the way humans work, and the discussion is about AI and trying to make a computer that can at least make decisions in a human way. For example, with this solution (even if it performed in English as well as a human), the program could not read the handwriting of someone writing in the same alphabet in another language; it would have to be 'taught' Spanish, German, French, etc., whereas a human would not have such a problem.
Now, abstraction is a fairly important problem for AI. Clearly, even with fast computing and large databases, you would need huge amounts of both instructions and data for a program to perform as a human does. If it could be easily achieved, then I am sure there would be numerous examples in 'non-secret' software which someone in the field would be aware of. For example, the recent deployment of face recognition in many places has exposed a big problem with the software: it can be easily deceived by hats, mustaches, eyeglasses, etc. A human would still recognize the face. This is pretty much state-of-the-art stuff, so I doubt the problem has been solved, and there is good reason for this:
Developing software is concerned with building models of manual processes. A program is an abstraction from reality. That is, we identify the important qualities or properties of the thing being modeled and discard the irrelevant ones. We abstract out the ideas that are relevant to the process we are trying to model.
From: Abstraction
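The quoted passage can be illustrated with a tiny sketch. Suppose we model a book for a lending process: the model keeps only the properties that process cares about and discards the rest. The class and field names below are my own illustration, not from the quoted source.

```python
from dataclasses import dataclass

@dataclass
class Book:
    """An abstraction of a physical book for a lending process.

    Kept: the properties relevant to lending.
    Discarded: paper weight, ink color, cover texture -- real
    properties of the physical object, irrelevant to this model.
    """
    title: str
    isbn: str
    checked_out: bool = False

def check_out(book: Book) -> bool:
    """Mark the book as checked out; fail if it already is."""
    if book.checked_out:
        return False
    book.checked_out = True
    return True
```

The human programmer did the abstracting here, deciding in advance which qualities matter, which is exactly the reductionism the next post complains about.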
All of computing is in a sense about abstraction; however, the abstraction is done by humans, choosing what to keep and what to discard. In other words, it is reductionistic. Which brings us to the problem of the table. One may start by saying that if an object has a smooth square surface on top and four legs, it is a table. However, that does not encompass all that humans would consider a table. A table can be triangular, rectangular, or round, and can have any number of legs. A computer would have to be 'taught', through a long set of instructions, how to account for all these varied types of tables. A human does not need to be given such instructions; he determines what a table is through insight, through abstraction. Now, we have only discussed letters and tables; imagine the problem computers would have just determining what each object in a house is, and even then it would still be a guess, performed less well than a human.
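The rule-based approach described above can be sketched directly. Below, a first predicate for "table" is too narrow, and a hand-patched second version admits more shapes while remaining incomplete: each new kind of table needs another rule. The attribute names are illustrative, not from any real library.

```python
def is_table_v1(obj):
    """First attempt: a table has a square top and four legs."""
    return obj.get("top") == "square" and obj.get("legs") == 4

def is_table_v2(obj):
    """Patched to admit more shapes and any positive leg count.

    Still incomplete: a pedestal or tree-stump table with no
    countable legs slips through, so another rule would be needed,
    and another after that.
    """
    return (obj.get("top") in {"square", "rectangular", "round", "triangular"}
            and obj.get("legs", 0) >= 1)

coffee_table = {"top": "round", "legs": 3}
print(is_table_v1(coffee_table))  # False -- the first rule is too narrow
print(is_table_v2(coffee_table))  # True -- but only after hand patching
```

The human, by contrast, recognizes the coffee table at a glance without ever having been handed either rule, which is the post's point about insight versus instruction.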
Let me add a bit to what Alamo-Girl said. You are being too literal. Plato used a table as an example. We have an idea of what a table is, we do not need to have seen every table to recognize an object as a table when we see a new one. How we do this is a matter for debate perhaps, but that we can do it is beyond doubt. It is also very doubtful (to me at least) if we could function at all without this ability.