Presentation Transcript


  1. ttcra.ppt 20110419

2. How to Pass a Turing Test and Escape from the Chinese Room
   William J. Rapaport
   Dept. of Computer Science & Engineering, Department of Philosophy, Department of Linguistics, and Center for Cognitive Science
   rapaport@buffalo.edu
   http://www.cse.buffalo.edu/~rapaport

3. How Computers Can Think
   • Computational cognitive science (symbolic or connectionist):
     • Cognition is computable
   • Philosophical implication:
     • IF cognitive states & processes can be expressed as algorithms, THEN they can be implemented in non-human computers.

4. How Computers Can Think
   • Are computers executing such cognitive algorithms merely simulating cognitive states & processes?
   • Or are they actually exhibiting them? Do such computers think?
   • Answer: Turing’s Test
   • Objection: Searle’s Chinese-Room Argument
   • My reply: “Syntactic Semantics”

5. The Imitation Game
   [Diagram: a MAN and a WOMAN each tell the INTERROGATOR “I’m the woman”; the interrogator must decide which is which.]

6. The Turing Test (The Imitation Game)
   [Diagram: a COMPUTER takes the man’s place; the computer and the WOMAN each tell the INTERROGATOR “I’m the woman”.]

7. The Turing Test #2 (The Imitation Game)
   [Diagram: “woman” is struck out and replaced; a COMPUTER and a MAN each tell the INTERROGATOR “I’m the man”.]

8. The Turing Test #3 (The Imitation Game)
   [Diagram: a COMPUTER and a HUMAN each tell the INTERROGATOR “I’m the human”.]

9. The Turing Test
   “I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.” - Turing 1950
   [Diagram: the Interrogator exchanges questions and responses with a hidden interlocutor and must judge: Human or Computer?]
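Read as a protocol, the setup in this diagram can be sketched in a few lines of Python. Nothing below is from Turing 1950 or the slides; the function and parameter names are illustrative assumptions.

```python
# A minimal sketch of the imitation-game protocol. `human`, `machine`,
# and `judge` are hypothetical callables, not from the presentation.
import random

def imitation_game(questions, human, machine, judge):
    """The interrogator (the judge) converses with two hidden
    respondents, A and B, and must say which is the human.
    `human` and `machine` each map a question string to an answer."""
    hidden = {"A": human, "B": machine}
    if random.random() < 0.5:                    # hide the identities
        hidden = {"A": machine, "B": human}
    transcript = [(q, hidden["A"](q), hidden["B"](q)) for q in questions]
    guess = judge(transcript)                    # judge returns "A" or "B"
    return hidden[guess] is human                # did the judge find the human?

# The machine "passes" if, over many runs, judges do no better than chance.
```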

10. Thinking vs. “Thinking”
   [Cartoon © The New Yorker, 5 July 1993]

11. Thinking vs. “Thinking”, cont’d.
   • The cartoon works because:
     • One does not know with whom one is communicating via computer
     • Nevertheless, we assume we are talking to a human, i.e., an entity with human cognitive capacities
   • = Turing’s point, namely the Argument from Analogy:
     • Solution to the Problem of Other Minds: I know I think; how do I know you do?
     • You are otherwise like me ⇒ (probably) you are like me w.r.t. thinking

12. Thinking vs. “Thinking”, cont’d.
   • What’s wrong with the Argument from Analogy:
     • I could be wrong about whether you’re biologically human.
   • But am I wrong about your being able to think? About your (human) cognitive abilities?
     • Turing: No!
     • More cautiously: whether I’m wrong depends on the definition of (human) cognitive abilities/thinking.

13. Thinking vs. “Thinking”, cont’d.
   • An analogy:
     • Birds fly.
     • Do people fly? Only on airplanes.
     • Do planes fly? They don’t flap their wings.

14. Thinking vs. “Thinking”, cont’d.
   But planes do fly:
   • Metaphorical extension (Lakoff/Johnson): planes fly = planes “fly”
     • Turing: the “use of words” changed
   • Flapping wings is not essential to flying:
     • The physics of flight is the same for birds & planes; a more general, abstract theory of flying
     • Turing: “general educated opinion” changed
   • NB: The use of ‘fly’ & general educated opinion have changed:
     • Spaceships “fly”; planes fly!
   • A single abstract theory can account for the metaphorical extension.

  15. What Is a Computer?

16. Thinking vs. “Thinking”, cont’d.
   • pre-1950: computers were (only) human
   • 2000: computers are (only) machines
   • General educated opinion:
     • Computers are viewed not as implementing-devices but in functional, I/O terms
     • Ditto for ‘think’ (more later)
   • BUT: it’s not really thinking (?)

17. The Chinese-Room Argument
   • It’s possible to pass the TT, yet not (really) think.
   [Diagram: an interrogator I (a native Chinese speaker) passes a story + questions (in Chinese) to H (who can’t understand Chinese) inside the room; using a program (in Eng.) for manipulating the Chinese “squiggles”, H returns responses (in fluent Chinese).]

18. The Chinese-Room Argument(s)
   • Argument from biology:
     (b1) Computer programs are non-biological.
     (b2) Cognition is biological.
     (b3) No non-biological computer program can exhibit biological cognition.
   • Argument from semantics:
     (s1) Computer programs are purely syntactic.
     (s2) Cognition is semantic.
     (s3) Syntax alone is not sufficient for semantics.
     (s4) No purely syntactic computer program can exhibit semantic cognition.

19. The Chinese-Room Argument(s)
   • Argument from biology:
     (b1) Computer programs are non-biological.
   X (b2) Cognition is biological.
     (b3) No non-biological computer program can exhibit biological cognition.
     • Reply: Cognition can be characterized abstractly & implemented in different media.
   • Argument from semantics:
     (s1) Computer programs are purely syntactic.
     (s2) Cognition is semantic.
   X (s3) Syntax alone is not sufficient for semantics.
     (s4) No purely syntactic computer program can exhibit semantic cognition.
     • Reply: Syntax suffices for semantics.
     • Better: Try to build a CR. What’s needed for NLU?

  20. Searle on the CRA

21. Searle on the Chinese-Room Argument
   • “I still don’t understand a word of Chinese,
     • and neither does any other digital computer, because all the computer has is what I have:
     • a formal program that attaches no meaning, interpretation, or content to any of the symbols.
     • Therefore, no formal program by itself is sufficient for understanding.”
   • NB: A program that did “attach” meaning, etc., might understand!
   • BUT: Searle denies that, too …

22. Searle on the CRA, cont’d.
   • “I see no reason in principle why we couldn’t give a machine the capacity to understand English or Chinese,
     • since in an important sense our bodies with our brains are precisely such machines.
   • But we could not give such a thing to a machine whose operation is defined solely in terms of computational processes over formally defined elements.”
   … Because …

23. Searle on the CRA, cont’d.
   … because …
   • “Only something having the same causal powers as brains can have intentionality” (i.e., cognition)
   … and what are these “causal powers”?

24. Searle on the CRA, cont’d.
   • “These causal powers are due to the (human) brain’s biological (i.e., chemical and physical) structure”
   … namely …?

25. Searle on the CRA, cont’d.
   • A simulated human brain …
     • “made entirely of old beer cans rigged up to levers and powered by windmills” …
   • would not really exhibit intentionality (i.e., cognition), even though it appeared to.

26. Searle on the CRA, cont’d.
   • Why must intentionality/cognition be biological?
     • Searle: Only biological systems have the requisite causal properties to produce intentionality.
   • What are the causal powers?
     • Searle: The ones that can produce perception, action, understanding, learning, and other intentional phenomena.
   • Isn’t this a bit circular?

27. Searle on the CRA, cont’d.
   • Possible clue to what the causal powers are:
     • “Mental states are both caused by the operations of the brain and realized in the structure of the brain”
   • I.e., implemented in the brain.

  28. The Implementation Counterargument

29. The Implementation Counterargument
   • Searle:
     • “Mental states are as real as any other biological phenomena, as real as lactation, photosynthesis, mitosis, or digestion.
     • Like these other phenomena, mental states are caused by biological phenomena and in turn cause other biological phenomena.”
   • Searle’s “mental states” are implementations of abstract mental states.

30. Implementation, cont’d.
   1. “Intentional states are both caused by & realized in the structure of the brain.”
      • BUT: Brains & beer-cans/levers/windmills can share structure. ∴ ¬1.
   2. “Intentional states are both caused by & realized in the neurophysiology of the brain.”
      ⇒ “Intentional states stand in causal relation to the neurophysiological.”
      & “Intentional states are realized in the neurophysiology of the brain,” i.e., they are implemented in the brain.

31. Implementation, cont’d.
   Abstractions can be implemented in more than one way (“multiple realizability”):

   Abstraction (Abstract Data Type)         Implementation
   Stack                                    Array, list
   Natural numbers (Peano axioms)           Any sequence of items satisfying the Peano axioms
   Musical score                            Performance
   Play script                              Performance
   Liquid                                   Water
   Liquid                                   Alcohol
   Mental states                            Some brain states/processes
   Mental states                            Some computer states/processes
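To make the table’s first row concrete, here is a minimal Python sketch of one abstraction with two implementations; the class and method names are mine, not from the slides.

```python
# One abstraction (the Stack ADT), two realizations in different "media".
from abc import ABC, abstractmethod

class Stack(ABC):                        # the abstraction: behavior only
    @abstractmethod
    def push(self, x): ...
    @abstractmethod
    def pop(self): ...

class ArrayStack(Stack):                 # realization 1: a Python list
    def __init__(self): self._items = []
    def push(self, x): self._items.append(x)
    def pop(self): return self._items.pop()

class LinkedStack(Stack):                # realization 2: linked cells
    def __init__(self): self._head = None
    def push(self, x): self._head = (x, self._head)
    def pop(self):
        x, self._head = self._head
        return x

# Code written against Stack behaves identically with either class:
for s in (ArrayStack(), LinkedStack()):
    s.push(1); s.push(2)
    assert s.pop() == 2                  # same ADT, different media
```

On Rapaport’s analogy, cognition plays the role of Stack here: what brain and computer share is the abstraction, not the implementing medium.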

32. Summary of the Biological Argument
   • Searle:
     • Understanding is biological.
     • A human brain can understand Chinese, BUT a computer running a Chinese NL program cannot understand Chinese.
   • Rapaport (et al.):
     • On an abstract, functional, computational notion of understanding as an ADT, understanding can be implemented in both a human brain & a computer ⇒ both can understand.

33. Implementation, cont’d.
   • ∀ implementation I, ∃ abstraction A such that I implements A.
   • ∀ abstraction A, ∃ more than one implementation I.
     • I.e., A can be implemented in other ways: “multiple realizability”
   [Diagram: abstraction A implemented by both I and I′.]

34. Implementation, cont’d.
   • Cognition, abstractly (i.e., psychologically, computationally) conceived, is the subject matter of cognitive science.
   [Diagram: abstract Cognition, implemented in the brain and implemented in a computer.]
   • Use of terms changes, because …
   • general educated opinion changes.

35. Argument from Semantics
   (s1) Computer programs are purely syntactic.
   (s2) Cognition is semantic.
   (s3) Syntax alone is not sufficient for semantics.
   (s4) ∴ No purely syntactic computer program can exhibit semantic cognition.
   • ¬(s3): Syntax suffices for semantics!

36. Syntactic Semantics: How Syntax Can Suffice for Semantics
   1. Semantics (relations between symbols & meanings) can be turned into syntax (relations among symbols & symbolized meanings) ⇒ syntax can suffice for semantic interpretation.
   2. Semantics is recursive:
      • We understand a syntactic domain in terms of an antecedently understood semantic domain.
      • Base case = syntactic understanding.
   3. The internal, narrow, 1st-person point of view is what’s needed for understanding/modeling cognition.
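A toy illustration of point 2 (the example chain is mine, not from the slides): each domain is understood via a prior domain, until the chain bottoms out in a domain understood only through its internal relations.

```python
# Interpreting one domain via another, until the regress bottoms out.
definitions = {
    "oak":   "tree",    # 'oak' understood in terms of 'tree'
    "tree":  "plant",   # 'tree' understood in terms of 'plant'
    "plant": None,      # base case: no further domain; understood
}                       # "syntactically", via relations to other markers

def understand(word):
    chain = [word]
    while definitions.get(chain[-1]) is not None:
        chain.append(definitions[chain[-1]])
    return " -> ".join(chain) + " (base case: syntactic understanding)"

print(understand("oak"))   # oak -> tree -> plant (base case: ...)
```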

37. Syntactic Semantics
   • Searle: Syntax cannot suffice for semantics, because links to the external world are missing.
   • 2 assumptions:
     • The computer has no links to the external world.
       • Solipsism?
     • External links are needed to attach meanings to symbols.
       • But, if so, then computers can have them just as humans do.

38. Semiotics
   • Given: a formal system of “markers” (a “symbol system”)
   • Syntax = the study of relations among the markers:
     • How to recognize/identify/construct them (what they look like; grammar)
     • How to manipulate them (proof theory)
     • No relations between markers & non-markers
     • Not “symbols” that wear their semantics on their sleeve

39. Semiotics (cont’d)
   • Semantics = the study of relations between markers & “meanings”
     • “Meanings”: part of a different domain of semantic interpretations (an “ontology”)
     • On this picture, syntax can’t/doesn’t suffice for semantics!
   • Pragmatics:
     • The study of relations between markers & interpreters (?)
     • The study of relations among markers, meanings, interpreters, & contexts (?)
     • I.e., everything else (!)

40. Syntactic Semantics
   • If the set of markers is unioned with the set of meanings,
   • & if the union is taken as a new set of markers,
     • i.e., the meanings are internalized in the symbol system,
   • then what was once semantics (i.e., relations between old markers & meanings)
   • becomes syntax (i.e., relations among new markers)
   • (& thus syntax can suffice for semantics).
   • Digression: linguistic semantics (esp. cognitive semantics) is like this (???)
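A sketch of the union move just described (the example vocabulary is mine): semantic relations between markers and meanings become syntactic relations among markers once the meanings join the marker set.

```python
# Internalizing the semantic domain turns semantics into syntax.
markers  = {"'tree'", "'oak'"}                   # old markers
meanings = {"TREE-concept", "OAK-concept"}       # old meanings

# Semantics: a relation *between* the two sets
interpretation = {("'tree'", "TREE-concept"),
                  ("'oak'",  "OAK-concept")}

new_markers = markers | meanings                 # internalize: take the union

# The same pairs now relate members of one set to members of that
# same set -- i.e., they are syntax -- so syntax does the work
# that semantics did.
assert all(a in new_markers and b in new_markers
           for a, b in interpretation)
```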

41. [Diagram: a syntactic domain (SYN DOM); the relations among its markers = Syntax.]

42. [Diagram: as before, plus a semantic domain (SEM DOM); the relations between the SYN DOM and the SEM DOM = Semantics.]

43. [Diagram: the SEM DOM is internalized into an enlarged SYN DOM; the former semantic relations are now relations among markers = Syntactic semantics.]

44. Syntactic Semantics I: Turning Semantics into Syntax
   • Can the semantic domain be internalized?
   • Yes: under the conditions obtaining for human language understanding.
   • How do we learn the meaning of a word? How do I learn that ‘tree’ means tree?
     • By association … (of tree with ‘tree’? No!)
     • … of my internal representation of ‘tree’ with my internal representation of a tree.
     • The internal representation could be activated neurons.
     • Binding of multiple modalities.

45. Robot Reply: Add Sensors & Effectors
   • Searle: That’s just more symbols.
   • Rapaport: Correct!
   [Diagram: Cassie; a causal link between (a) a tree in the world and (b) Cassie’s internal representation of the tree in her semantic network.]

46. Syntactic Semantics
   • Ditto for a computer (Cassie):
     • I say something to Cassie in English; Cassie builds internal nodes representing my utterance.
     • I show pictures to Cassie (or Cassie sees something); Cassie builds internal nodes representing what she sees.
     • These 2 sets of nodes are part of the same KB.
   • Ditto for formal semantics: syntax & semantics are both defined syntactically.
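A sketch of the Cassie scenario as described above; the node and edge vocabulary is invented for illustration (this is not SNePS code).

```python
# Linguistic and perceptual nodes live in one KB, so "grounding" is
# an internal edge between two representations, not a link to the world.
kb_nodes, kb_edges = set(), set()

def node(label):
    kb_nodes.add(label)
    return label

# (1) I say "the tree is tall" to Cassie: linguistic nodes
utterance = node("utterance:'the tree is tall'")
word      = node("word:'tree'")
kb_edges.add((utterance, "contains", word))

# (2) Cassie sees a tree: perceptual nodes
percept = node("percept:tall-leafy-object")

# (3) Both node sets are in the same KB, so the word-percept
#     association is itself a syntactic (node-to-node) relation.
kb_edges.add((word, "expresses", percept))
```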

47. Points of View
   • To understand how a cognitive agent understands, & to construct a computational cognitive agent, we must take the 1st-person point of view:
     • What’s going on “in” the agent’s head, from the agent’s POV
     • “Methodological solipsism” (Fodor, Putnam); “narrow” or “internal” cognition
     • We don’t need to understand the causal/historical origins of internal symbols.
   • Searle-in-the-CR’s POV vs. the interrogator’s POV:
     • CRA: Searle-in-the-CR’s POV trumps the interrogator’s POV.
     • TT & SynSem: the interrogator’s POV trumps Searle-in-the-CR’s.

48. (From Baum, Wizard of Oz, 1900: 34-35)
   When Boq [a Munchkin] saw her silver shoes, he said, “You must be a great sorceress.”
   “Why?” asked [Dorothy].
   “Because you wear silver shoes and have killed the wicked witch. Besides, you have white in your frock, and only witches and sorceresses wear white.”
   “My dress is blue and white checked,” said Dorothy. …
   “It is kind of you to wear that,” said Boq. “Blue is the color of the Munchkins, and white is the witch color; so we know you are a friendly witch.”
   Dorothy did not know what to say to this, for all the people seemed to think her a witch, and she knew very well she was only an ordinary little girl who had come by the chance of a cyclone into a strange land.
