
Two models of accentuation that the brain might use






Presentation Transcript


  1. Two models of accentuation that the brain might use. John Goldsmith, University of Chicago

  2. What’s the point? • Take a step back from linguistic analysis, and ask: what is the simplest way to perform the computations that are central and important for the data of metrical systems? • What kind of […neural…] hardware would be good at performing that kind of computation?

  3. An [implicit] assumption... Linguistics is an autonomous profession, with its own autonomous methods… But it does not address an autonomous reality, nor does it have its own autonomous truth. It shares truth with all other disciplines. And...

  4. …there is no certainty that linguistic methods will get us as close to the Truth as we wish. No guarantee that linguistic methods will get us arbitrarily close to the truth. Put another way, it may be that pouring more data (of a certain sort) into contemporary linguistic methods gives us a theory that is overtrained on its data. The only escape from that is to cooperate with other research traditions.

  5. Also implicit... • Linguists and non-linguists (psychologists, neurocomputational modelers) must each take a step towards each other to find a common middle ground. • This means...

  6. Non-linguists... • must realize that English is an outlier among languages... [diagram: the space of languages, with English as an outlier]

  7. Linguists... • must acknowledge that much of their theory-building is motivated by their everyday convenience. (For example, they strongly prefer models whose computation requires paper or a blackboard at least, but also at most.)

  8. [diagram: linguistics, computation, biology]

  9. Two models in neurocomputing: 1. In space: lateral inhibition (work done jointly with Gary Larson; discrete unit modeling). [2. In time: neural oscillation]

  10. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  11. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  12. Let’s look at the program --

  13. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  14. Metrical phonology: work during 1975-1985 • Mark Liberman • Liberman and Prince • Morris Halle, J.-R. Vergnaud • Alan Prince • Bruce Hayes, especially Metrical Stress Theory (1995)

  15. Patterns of alternating stress: the simplest cases. “Create trochaic feet, from left to right” [ x x x x x x x . . .

  16. [ (S W) x x x x x . . .

  17. [ (S W)(S W) x x x . . .

  18. [ (S W)(S W)(S W) x . . .

  19. [ (S W)(S W)(S W) x . . .

  20. [ (S W)(S W)(S W)(S W) . . . x x x ] “final dactyl”

  21. Patterns of alternating stress: the other way... “Create trochaic feet, from right to left” x x x x x x (S W) ]

  22. x x x x (S W)(S W) ]

  23. x x (S W)(S W)(S W) ]

  24. (S W)(S W)(S W)(S W) ]

  25. [ x (S W)(S W)(S W)(S W)(S W) ] “initial dactyl”
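A minimal sketch of the footing procedures on slides 15 through 25. This is an illustrative reconstruction, not Goldsmith's own program: it pairs syllables into (Strong, Weak) trochaic feet starting from whichever edge is specified, and leaves any stray syllable of an odd-length word unfooted.

```python
def trochaic_feet(n_syllables, direction="LR"):
    """Group n_syllables into trochaic (Strong-Weak) feet.

    direction="LR": build feet from the left edge (slides 15-20);
    direction="RL": build feet from the right edge (slides 21-25).
    A leftover syllable in an odd-length word stays unfooted ("x").
    Illustrative sketch only.
    """
    labels = ["x"] * n_syllables
    order = (list(range(n_syllables)) if direction == "LR"
             else list(range(n_syllables - 1, -1, -1)))
    # Take syllables two at a time in the chosen direction; within each foot
    # the left-hand member is Strong and the right-hand member Weak (a trochee).
    for k in range(0, n_syllables - 1, 2):
        a, b = order[k], order[k + 1]
        labels[min(a, b)] = "S"
        labels[max(a, b)] = "W"
    return labels

# A seven-syllable word, footed in both directions:
print(trochaic_feet(7, "LR"))  # ['S', 'W', 'S', 'W', 'S', 'W', 'x']
print(trochaic_feet(7, "RL"))  # ['x', 'S', 'W', 'S', 'W', 'S', 'W']
```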

  26. This is all very convenient, but... Should we be thinking about constructing structure? Computing a result? What’s the minimal way to compute the right result?

  27. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  28. Dynamic computational network [diagram: initial and final activation of each unit, with neighbour connection weights a (alpha) and b (beta)]

  29. Beta = -.9: rightward spread of activation

  30. Alpha = -.9; leftward spread of activation
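Slides 28-30 can be read as follows: each syllable is a unit whose activation is repeatedly recomputed from its underlying value plus weighted contributions from its neighbours, with beta scaling the contribution of the left neighbour (so a negative beta produces the rightward, sign-alternating spread of slide 29) and alpha scaling the contribution of the right neighbour (leftward spread, slide 30). The sketch below assumes the update rule s[i] = u[i] + beta*s[i-1] + alpha*s[i+1], and an assumed small positive underlying activation on the first syllable standing in for whatever edge prominence the full model supplies; the original program's exact rule and parameters may differ.

```python
def relax(underlying, alpha=0.0, beta=0.0, iterations=100):
    """Settle a chain of units by repeated synchronous updates.

    Assumed dynamics: s[i] = u[i] + beta * s[i-1] + alpha * s[i+1],
    where u is the underlying (input) activation vector.
    """
    n = len(underlying)
    s = list(underlying)
    for _ in range(iterations):
        s = [
            underlying[i]
            + beta * (s[i - 1] if i > 0 else 0.0)
            + alpha * (s[i + 1] if i < n - 1 else 0.0)
            for i in range(n)
        ]
    return s

# Rightward spread (beta = -0.9), driven by an assumed kick on the first syllable:
final = relax([1.0, 0, 0, 0, 0, 0], beta=-0.9)
print(["S" if a > 0 else "w" for a in final])  # ['S', 'w', 'S', 'w', 'S', 'w']
```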

  31. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  32. Examples (Hayes). Pintupi (Hansen and Hansen 1969, 1978; Australia): “syllabic trochees”: stress on odd-numbered syllables (counting rightward), with an extrametrical ultima. Words of two to seven syllables: S s | S s s | S s S s | S s S s s | S s S s S s | S s S s S s s

  33. Weri (Boxwell and Boxwell 1966, Hayes 1980, HV 1987) • Stress the ultima, plus • Stress all odd-numbered syllables, counting from the end of the word

  34. Warao (Osborn 1966, HV 1987) Stress penult syllable; plus all even-numbered syllables, counting from the end of the word. (Mark last syllable as extrametrical, and run.)

  35. Maranungku (Tryon 1970) Stress first syllable, and All odd-numbered syllables from the beginning of the word.

  36. Garawa (Furby 1974) (or Indonesian, …) • Stress on the initial syllable; • Stress on the penult; • Stress on all even-numbered syllables, counting leftward from the end; but • “Initial dactyl effect”: no stress permitted on the second syllable.
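Slides 32-36 are all variations on one theme: stress every second syllable, counting from one edge or the other, possibly with an extrametrical final syllable or a fixed edge stress. The sketch below renders each slide's verbal description directly as a small function (my own illustrative encoding, not Goldsmith's or Hayes's code); "S" marks a stressed syllable, "s" an unstressed one.

```python
def pintupi(n):
    # Slide 32: stress odd-numbered syllables from the left; extrametrical ultima.
    return ["S" if i % 2 == 0 and i != n - 1 else "s" for i in range(n)]

def weri(n):
    # Slide 33: stress the ultima and every odd-numbered syllable counting from the end.
    return ["S" if (n - 1 - i) % 2 == 0 else "s" for i in range(n)]

def warao(n):
    # Slide 34: stress the penult and every even-numbered syllable counting from the end.
    return ["S" if (n - 1 - i) % 2 == 1 else "s" for i in range(n)]

def maranungku(n):
    # Slide 35: stress the first syllable and every odd-numbered syllable from the beginning.
    return ["S" if i % 2 == 0 else "s" for i in range(n)]

def garawa(n):
    # Slide 36: initial stress, penult stress, every even-numbered syllable counting
    # leftward from the end; but no stress permitted on the second syllable.
    pattern = ["S" if (n - 1 - i) % 2 == 1 else "s" for i in range(n)]
    pattern[0] = "S"
    if n > 1:
        pattern[1] = "s"   # the "initial dactyl effect"
    return pattern

for rule in (pintupi, weri, warao, maranungku, garawa):
    print(rule.__name__, "".join(rule(7)))
# Expected output for seven-syllable words:
#   pintupi SsSsSss   (matches the seven-syllable form listed on slide 32)
#   weri SsSsSsS
#   warao sSsSsSs
#   maranungku SsSsSsS
#   garawa SssSsSs
```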

  37. Seminole/Creek (Muskogean) High tone falls on the final (ultima) or penult, depending on a parity-counting procedure that starts at the beginning of the word (“parity-counting” means counting modulo 2: 1, 0, 1, 0: like counting daisy-petals). High tone is on a count of “0”. But a heavy syllable always gets count “0”. In words with only light syllables (CV): a seven-syllable word counts 1 0 1 0 1 0 1 (the last “0” is the penult), and an eight-syllable word counts 1 0 1 0 1 0 1 0 (the last “0” is the ultima).
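The parity-counting procedure of slide 37 can also be sketched in a few lines. The reading assumed here is my interpretation of the slide, not a full analysis of Creek accent: counts run 1, 0, 1, 0, ... from the left, a heavy syllable is forced to count 0 with the alternation continuing from there, and the high tone lands on the ultima if its count is 0 and otherwise on the penult.

```python
def creek_high_tone(syllables):
    """Parity-counting sketch for the Seminole/Creek pattern (slide 37).

    syllables: a string of 'L' (light, CV) and 'H' (heavy) marks.
    Returns the list of counts and the index of the high-toned syllable.
    """
    counts = []
    parity = 1
    for syl in syllables:
        if syl == "H":
            parity = 0          # a heavy syllable always gets count 0
        counts.append(parity)
        parity = 1 - parity     # keep counting modulo 2
    tone = len(syllables) - 1 if counts[-1] == 0 else len(syllables) - 2
    return counts, tone

# The two all-light words from the slide:
print(creek_high_tone("LLLLLLL"))   # ([1, 0, 1, 0, 1, 0, 1], 5)     high tone on the penult
print(creek_high_tone("LLLLLLLL"))  # ([1, 0, 1, 0, 1, 0, 1, 0], 7)  high tone on the ultima
```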

  38. Harmonic conditioning: improves well-formedness [diagram]

  39. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  40. Network M. The input (underlying representation) is a vector U. Dynamics (1): S(t+1) = U + M S(t). The output is S*, the equilibrium state of (1), which by definition satisfies S* = U + M S*. Hence S* = (I - M)^(-1) U, and the input can be read back off as U = (I - M) S*. Quite a surprise!

  41. Inversion, again -- note the near-eigenvector property. Dynamics: S(t+1) = U + M S(t), starting from S0 = U. Then S1 = U + M S0, S2 = U + M S1, ..., and the sequence settles at the equilibrium S* = Sn, which by definition satisfies S* = U + M S*. Hence U = (I - M) S* (I is the identity matrix).

  42. Fast recoverability of underlying form This means that if you take the output S* of a network of this sort, and make the output undergo the network effect once — that’s M S* — [M’s a matrix, S a vector] and subtract that from S* — that’s (I-M) S* — you reconstruct what that network’s input state was. (This would be a highly desirable property if we had designed it in!)
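Slides 40-42 can be checked numerically. The sketch below assumes the matrix form of the dynamics given above, S(t+1) = U + M S(t), with the neighbour weights beta and alpha on the sub- and super-diagonal of M; it verifies that the iteration settles to S* = (I - M)^(-1) U, and that a single application of (I - M) to the settled state recovers the underlying form U, the "fast recoverability" of slide 42.

```python
import numpy as np

n = 6
alpha, beta = 0.0, -0.9            # example weights: rightward spread only (cf. slide 29)
U = np.zeros(n)
U[0] = 1.0                          # assumed edge prominence driving the alternation

# M carries the neighbour couplings: row i receives beta from unit i-1 and alpha from unit i+1.
M = np.diag([beta] * (n - 1), k=-1) + np.diag([alpha] * (n - 1), k=1)

# Relax to equilibrium by iterating S <- U + M S.
S = np.zeros(n)
for _ in range(200):
    S = U + M @ S

# The equilibrium is (I - M)^(-1) U ...
S_star = np.linalg.solve(np.eye(n) - M, U)
print(np.allclose(S, S_star))                    # True

# ... and one step of (I - M) applied to S* gives back the underlying form.
print(np.allclose((np.eye(n) - M) @ S_star, U))  # True
```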

  43. learnability

  44. Dynamic computational nets • Brief demonstration of the program • Some background on (some aspects of) metrical theory • This network model as a minimal computational model of the solution we’re looking for. • Its computation of familiar cases • Interesting properties of this network: inversion and learnability • Link to neural circuitry

  45. neural circuitry

  46. The challenge of language: • For the hearer: he must perceive the (intended) objects in the sensory input despite the extremely impoverished evidence of them in the signal -- a task like (but perhaps harder than) visual pattern identification; • For the speaker: she must produce and utter a signal which contains enough information to permit the hearer to perceive it as a sequence of linguistic objects.
