Week 7. Learnability
GRS LX 865 Topics in Linguistics

Upload: charles-morrison

Post on 16-Dec-2015

TRANSCRIPT

Page 1: Week 7. Learnability GRS LX 865 Topics in Linguistics

Week 7. Learnability

GRS LX 865 Topics in Linguistics

Page 2:

Logical problem of language acquisition

The grammar that people end up with is very complicated, and underdetermined by the data.

The main argument for this (“poverty of the stimulus”) is that there are many generalizations that a kid could make on the basis of the input data that would be wrong, that would not result in a language conforming to the principles we’ve discovered to hold true of all adult languages.

Page 3:

That-t

John said Mary will meet Bill tomorrow.
John said that Mary will meet Bill tomorrow.
Who did John say Mary will meet tomorrow?
Who did John say that Mary will meet tomorrow?
Who did John say will meet Bill tomorrow?
*Who did John say that will meet Bill tomorrow?

Page 4:

Pronouns

While Mary sat on the T, she read the Metro.

While she sat on the T, Mary read the Metro.

Mary said she read the Metro today.
She said Mary read the Metro today.

Page 5:

CSC (Coordinate Structure Constraint)

John drinks Coke with ice.
What does John drink Coke with?
What does John drink with ice?
John drinks rum and Coke.
*What does John drink rum and?
*What does John drink and Coke?

Page 6:

The Subset problem

Suppose the kid made the wrong choice in each case, and generalized.
You can either have or not have that to introduce an embedded clause.
Pronouns can refer to any NP.
To form a who-question, just move who to the front and drop the thing it stands for.
As far as the kid’s concerned, all of the sentences that we just saw are good sentences. (But when the kid grows up, s/he’ll know otherwise.)

Page 7:

The Subset problem

Of course, the kid will never hear
*Who did John say that will meet Bill tomorrow?
She (= Mary) said Mary read the Metro today.
What does John drink and Coke?
But the kid will probably also never hear
This year’s season of Law & Order will be the last.
ABC just announced a fourth season of Sports Night.
Yet the kid won’t have trouble seeing that these are grammatical.

Page 8:

The Subset problem

So, the trick is: How can the kid get the knowledge (that adults do have, invariably) about what sentences are ungrammatical, given that simply not hearing the sentence before is not evidence?

The answer: The constraints responsible for ruling out the bad sentences are part of the presuppositions made before acquisition begins—this is UG.

Page 9:

So, some language knowledge is already there

So, kids come at the task with some kind of framework into which to fit the things they learn from the input.

Languages do differ, so kids need to learn what language they’re actually hearing.

The basic idea of the Principles & Parameters view is that the Principles are part of the human language faculty we come with, and the points of variation (the parameters) can differ from language to language.

Page 10:

Points of variation

In the GB/P&P type view, kids need to determine the settings for the individual parameters:
Does the V move to T?
Can the subject be null?
Which of the possible binding domains does the language use?
What are the bounding nodes for wh-movement?
Do any wh-words move overtly?
Do all wh-words move overtly?

Page 11:

Points of variation

In an OT view of grammar, the (inherently conflicting) constraints themselves are what UG provides, and kids must determine which ones take priority over which others:
Is it more important to have a subject or to minimize structure?
Is it more important to mark the scope of a question with a wh-word or to avoid the effort of movement?
…

Page 12:

Navigating grammar spaces

Regardless of the approach, the idea is that in the space of possible grammars, there is a restricted set that corresponds to possible human grammars.

Kids must in some sense navigate that space until they reach the grammar that they’re hearing in the input data.

Page 13:

Questions

So how do they do it?
Where do they start?
What kind of evidence do they need?
How much evidence do they need?

Research on learnability in language acquisition has concentrated on these issues.

Page 14:

Are we there yet?

There are a lot of grammars to choose from, even if UG limits them to some finite number.

Kids have to try out many different grammars to see how well they fit what they’re hearing.

We don’t want to require that kids remember everything they’ve ever heard, and sit there and test their current grammar against the whole corpus of utterances—that’s a lot to remember.

Page 15:

Are we there yet?

We also want the kid, when they get to the right grammar, to stay there.

Error-driven learning
Most theories of learnability rely on a kind of error-detection.
The kid hears something, it’s not generable by their grammar, so they have to switch their hypothesis, move to a new grammar.

Page 16:

Plasticity

Yet, particularly as the navigation progresses, we want them to be zeroing in on the right grammar.

Finding an error doesn’t mean that you (as a kid) should jump to some random other grammar in the space.

Generally, you want to move to a nearby grammar that improves your ability to generate the utterance you heard—move in baby steps.

Page 17:

Triggers

Gibson & Wexler (1994) looked at learning word order in terms of three parameters (head, spec, V2).

Their triggering learning algorithm says if you hear something you can’t produce, try switching one parameter and see if it helps. If so, that’s your new grammar. Otherwise, stick with the old grammar and hope you’ll get a better example.
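The single-step, greedy character of the algorithm can be sketched in a few lines of Python. This is an illustration, not Gibson & Wexler's actual formulation: the parameter names and the `parses` stub are my own.

```python
import random

def tla_step(grammar, sentence, parses, rng=random):
    """One Triggering Learning Algorithm step.

    grammar: tuple of booleans, one per parameter (e.g. head, spec, V2)
    parses:  function (grammar, sentence) -> bool, supplied by the
             linguistic model; stands in for the learner's parser
    """
    if parses(grammar, sentence):
        return grammar                       # no error: keep the grammar
    i = rng.randrange(len(grammar))          # try flipping ONE parameter
    flipped = grammar[:i] + (not grammar[i],) + grammar[i + 1:]
    # Conservative move: adopt the neighbor only if it can now parse
    # the trigger; otherwise wait and hope for a better example.
    return flipped if parses(flipped, sentence) else grammar
```

Because each step moves at most one parameter, the learner takes baby steps, which is exactly what opens the door to the local-maximum problem on the next slides.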

Page 18:

Local maxima

A problem they encountered is that there are certain places in the grammar space where you end up more than one switch away from a grammar that will produce what you hear.

This is locally as good as it gets—nothing next to it in the grammar space is better—yet if you consider the whole grammar space, there is a better fit somewhere else, you just can’t get there with baby steps.

Page 19:

Local maxima

This is a point where any move you make is worse, so a conservative algorithm will never get you to the best place.

Page 20:

Children vs. OT

Optimality Theory is a theory of ranked constraints, some of which are in direct contradiction.

Standard simple example from phonology:
Onset: syllables start with a consonant
NoCoda: syllables don’t end with a consonant
Max: say what you mean (say everything in the input)
Dep: say only what you mean (don’t add anything to the input).

Page 21:

ba

If you want to say ba (if the word you’re trying to say looks like /ba/ in the lexicon), you can say ba and satisfy all the constraints.
Ba starts with a consonant (√Onset)
Ba ends in a vowel (√NoCoda)
Ba has all of the input sounds (√Max)
Ba has no new sounds (√Dep)

Page 22:

bat

But if the word you want to say is /bat/, there’s a problem.
Say bat and you satisfy Max, Dep, and Onset, but you violate NoCoda (it ends in a consonant).
Say ba and you satisfy Dep, Onset, and NoCoda, but you violate Max (you left out the /t/).
Languages make different choices about which wins, so kids have to decide: Is NoCoda more important than Max or vice versa?

Page 23:

at

Similarly, /at/ results in these options:
Say at, satisfying Max and Dep, at the expense of Onset and NoCoda.
Say a, satisfying Dep and NoCoda, at the expense of Onset and Max.
Say ta, satisfying Onset and NoCoda, at the expense of Max and Dep.
Say tat, satisfying Max and Onset, at the expense of Dep and NoCoda.
Which constraint is more important in the language determines the output.

Page 24:

Tableau

/at/     Max   Dep   Onset   NoCoda
[at]                   *       *
[a]       *            *
[ta]      *     *
[tat]           *              *
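The winnowing procedure a tableau encodes is easy to state procedurally. Here is a minimal sketch (an illustration, not Praat's implementation), using the violation profiles for /at/ from the slides: filter the candidates constraint by constraint, highest-ranked first, keeping only the least-offending candidates at each step.

```python
# Violation profiles for /at/ from the tableau: candidate -> counts.
VIOLATIONS = {
    "at":  {"Onset": 1, "NoCoda": 1},
    "a":   {"Max": 1, "Onset": 1},
    "ta":  {"Max": 1, "Dep": 1},
    "tat": {"Dep": 1, "NoCoda": 1},
}

def optimal(ranking, violations=VIOLATIONS):
    """Evaluate candidates against the constraints in ranking order
    (highest-ranked first); keep the candidates with the fewest
    violations at each constraint until one survives."""
    candidates = list(violations)
    for constraint in ranking:
        fewest = min(violations[c].get(constraint, 0) for c in candidates)
        candidates = [c for c in candidates
                      if violations[c].get(constraint, 0) == fewest]
        if len(candidates) == 1:
            break                     # remaining violations are irrelevant
    return candidates[0]
```

Each of the four rankings on the following slides then picks out a different winner: for instance, `optimal(["Max", "Dep", "Onset", "NoCoda"])` returns `"at"`.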

Page 25:

Max, Dep >> Ons, NoCoda

/at/      Max   Dep   Onset   NoCoda
☞ [at]                  *       *
[a]        *!           *
[ta]       *!    *
[tat]            *!             *

Page 26:

Max, Ons >> Dep, NoCoda

/at/      Max   Onset   Dep   NoCoda
[at]             *!            *
[a]        *!    *
[ta]       *!            *
☞ [tat]                  *     *

Page 27:

NoCoda, Ons >> Dep, Max

/at/      NoCoda   Onset   Dep   Max
[at]        *!       *
[a]                  *!           *
☞ [ta]                      *     *
[tat]       *!              *

Page 28:

NoCoda, Dep >> Ons, Max

/at/      NoCoda   Dep   Onset   Max
[at]        *!            *
☞ [a]                     *       *
[ta]                 *!           *
[tat]       *!       *

Page 29:

4 constraints, 24 rankings

Max, Dep, Ons, and NoCoda have hardly exhausted the systematic knowledge we have about phonology.

There are lots more constraints, but every new constraint we add can in principle be ranked between every two constraints we had before.
4 constraints: 24 = 4×3×2×1 = 4! rankings
5 constraints: 120 = 5×4×3×2×1 = 5! rankings
20 constraints?
30 constraints?
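The arithmetic here is just factorial growth, which a two-line check makes vivid:

```python
from math import factorial

# n constraints admit n! total rankings, since each newly added
# constraint can slot into any position among those already ranked.
for n in (4, 5, 20, 30):
    print(f"{n} constraints: {factorial(n)} rankings")
```

Already at 20 constraints the space holds over 10^18 rankings, which is why the next slide calls it vast.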

Page 30:

Wide open spaces

The grammar space that the child has to navigate if OT is the right model of grammar is vast.

Subhierarchies
There are some constraints which seem to be fixed in relative ranking to other constraints, cutting down the space a little bit.
*[I-onset] >> *[b-onset] >> *[t-onset]
But that still leaves a lot of options.

Page 31:

Constraint Demotion

Tesar & Smolensky
If the kid hears [tap] for /tap/ but would have pronounced it [ta], there’s a problem—the kid needs to move to a new grammar. The constraints must be reordered so that [tap] comes out.
Diagnosis: Kid’s got NoCoda >> Max, but needs to have Max >> NoCoda.

Page 32:

NoCoda >> Ons >> Dep >> Max

/tap/     NoCoda   Onset   Dep   Max
[tap]       *!
☞ [ta]                            *
[a]                  *!           **
[ap]        *!       *            *

Page 33:

Constraint demotion

We demote the constraint that’s killing the correct candidate to a point in the ranking below the constraint that would kill the incorrect candidate.

We re-rank NoCoda to below Max, and solve the problem.
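The demotion step can be sketched as follows (a simplified version of Tesar & Smolensky's algorithm; the data layout and function name are my own): find the constraints that prefer the learner's wrong output, and move them to just below the highest-ranked constraint that prefers the heard form.

```python
def demote(ranking, winner_viols, loser_viols):
    """One constraint-demotion step.

    ranking:      list of constraint names, highest-ranked first
    winner_viols: violations of the heard (correct) form, e.g. [tap]
    loser_viols:  violations of the learner's (incorrect) form, e.g. [ta]
    """
    prefers_winner = [c for c in ranking
                      if loser_viols.get(c, 0) > winner_viols.get(c, 0)]
    prefers_loser = [c for c in ranking
                     if winner_viols.get(c, 0) > loser_viols.get(c, 0)]
    if not prefers_winner:
        return ranking
    # Highest-ranked constraint that favors the heard form...
    pivot = min(ranking.index(c) for c in prefers_winner)
    # ...and the offenders get demoted to just below it.
    kept = [c for c in ranking if c not in prefers_loser]
    cut = kept.index(ranking[pivot]) + 1
    return kept[:cut] + prefers_loser + kept[cut:]
```

On the slide's example (kid's ranking NoCoda >> Ons >> Dep >> Max; heard [tap] with one NoCoda violation vs. the kid's [ta] with one Max violation), this returns Ons >> Dep >> Max >> NoCoda.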

Page 34:

Ons >> Dep >> Max >> NoCoda

/tap/     Onset   Dep   Max   NoCoda
☞ [tap]                         *
[ta]                     *!
[a]         *!           **
[ap]        *!           *      *

Page 35:

Repeat, repeat, repeat

Eventually, if you do this long enough, Tesar & Smolensky argue, you’ll reach the adult ranking (or something equivalent).

Along the way, kids have intermediate grammars (possible human grammars, but not the target).

Page 36:

M vs. F

Constraints come in two flavors generally: those that say “say exactly what you mean” (Faithfulness—make the output look like the input) and those that say “conform to the admissible shapes for language output” (Markedness).
Max, Dep = Faithfulness
Ons, NoCoda = Markedness

Page 37:

M >> F

Kids’ syllables are generally of the ba sort (and not of the strengths sort), suggesting that initially the Markedness constraints are outranking the Faithfulness constraints, and then re-ranking brings them more in line with the adults. (The idea is that they may try to say /strengths/, but it comes out like [te] at first.)

Page 38:

Wait a minute, this is crazy

One thing that learnability research of this sort tends to leave mostly undiscussed is how the kid comes to hypothesize /strengths/ as the underlying form in the first place. After all, if it sounds like [te], why not suppose it is /te/?

Be that as it may, this is a clear place where more work is needed (recent work by Tesar and Smolensky separately makes some attempts).

For now, we assume the kid knows the word.

Page 39:

Optionality

Another issue arises if the grammar allows true optionality—this seems possible in phonology at least (Arto Anttila and Bill Reynolds have done a lot of work in this domain, with several others since).
If the adult allows two different surface realizations of the same input, the constraint demotion algorithm is going to eternally flip-flop between rankings.
The constraint demotion algorithm assumes strict ranking, one output per input.

Page 40:

One approach: multiple grammars

One way to look at this is as taking people to have multiple grammars.

Plausible: registers, for example, or multilingualism, or dialects.

(Of course, how would the kid decide which grammar to demote constraints in?)

Page 41:

Multiple grammars vs. Darwin

One approach (e.g., Charles Yang) says that kids maintain several grammars and test them all at once, the ones which do a worse job being demoted to less frequent use.

You have several grammars you choose between, each with a probability of choosing it, and if a grammar never seems to be predicting the right forms, it will become less and less likely to be used in the future. Acquisition in the end lands on one (or a couple, if there is free variation).
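This reward-penalty dynamic can be sketched as follows (an illustrative linear scheme; Yang's actual model differs in its details, and the learning rate here is an assumption):

```python
import random

def variational_step(probs, sentence, parses, rate=0.1, rng=random):
    """One trial of a variational learner: sample a grammar by its
    probability, reward it if it parses the input, penalize it if not.
    probs: {grammar: probability}, summing to 1."""
    grammars = list(probs)
    g = rng.choices(grammars, weights=[probs[x] for x in grammars])[0]
    new = {}
    if parses(g, sentence):
        for x in grammars:          # reward g, shrink the rest
            new[x] = probs[x] + rate * (1 - probs[x]) if x == g \
                else probs[x] * (1 - rate)
    else:
        for x in grammars:          # penalize g, redistribute its mass
            new[x] = probs[x] * (1 - rate) if x == g \
                else rate / (len(grammars) - 1) + probs[x] * (1 - rate)
    return new
```

A grammar that keeps failing gets sampled less and less often, so over many trials the probability mass settles on the successful grammar (or stays split over a couple, matching free variation).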

Page 42:

Another approach: Boersma

Paul Boersma (and more recently Bruce Hayes) have been championing a more statistical approach.

Constraints have a position in a ranking (associated with a number), but when you do an evaluation, the number it’s actually evaluated at is governed by a normal (“bell curve”) distribution—there’s noise in the system.

Page 43:

Boersma

This is a grammar where A usually >> B, but sometimes B >> A.

You can compute exactly how often B >> A.

This can get adult variation, and provides a way for kids to learn the rankings too.

[Figure: two overlapping ranking distributions, constraint A centered above constraint B]
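The noisy evaluation is easy to simulate. A sketch in the spirit of Boersma's model (the ranking values 101 and 99 and the noise sd of 2.0 are assumptions chosen for illustration):

```python
import random

def sample_ranking(values, sd=2.0, rng=random):
    """Add Gaussian evaluation noise to each constraint's ranking
    value and return the constraints sorted highest-first."""
    noisy = {c: v + rng.gauss(0, sd) for c, v in values.items()}
    return sorted(noisy, key=noisy.get, reverse=True)

values = {"A": 101.0, "B": 99.0}     # A usually >> B
random.seed(1)
flips = sum(sample_ranking(values)[0] == "B" for _ in range(10_000))
print(flips / 10_000)   # B >> A on roughly a quarter of evaluations
```

With a 2-point gap and independent noise on each constraint, the chance that B outranks A on a given evaluation is a fixed, computable quantity, which is how a single grammar can produce stable variation.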

Page 44:

GLA

Boersma’s gradual learning algorithm is sensitive to the frequencies in the input, and will move the centers of the constraints up or down in order to try to match the frequency of A >> B with respect to B >> A. It works similarly to constraint demotion otherwise.

Advantages:
Gets statistical properties of the input
More resilient in the face of ungrammatical noise
Fancy simulation program (Praat)
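One GLA update can be sketched like so (a symmetric, simplified version; Boersma's algorithm has more to it, and the plasticity value is an assumption): constraints that penalize the heard form move down a little, and constraints that penalize the learner's wrong output move up a little.

```python
def gla_update(values, heard_viols, learner_viols, plasticity=0.1):
    """Nudge ranking values after an error.

    values:        {constraint: ranking value (center of its bell curve)}
    heard_viols:   violations of the form actually heard
    learner_viols: violations of the learner's (wrong) output
    """
    new = dict(values)
    for c in values:
        h, l = heard_viols.get(c, 0), learner_viols.get(c, 0)
        if h > l:
            new[c] -= plasticity    # it penalizes the heard form: demote
        elif l > h:
            new[c] += plasticity    # it penalizes the wrong form: promote
    return new
```

Because each error moves the centers only slightly, the learner ends up matching input frequencies instead of flip-flopping, which is what makes the GLA resilient to optionality and to ungrammatical noise.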

Page 45:

NoCoda >> Ons >> Dep >> Max

/tap/     NoCoda   Onset   Dep   Max
[tap]       *!
☞ [ta]                            *
[a]                  *!           **
[ap]        *!       *            *

Page 46:

Praat

Praat is Boersma’s super-program that either already does, or in the future will do, everything a phonologist/phonetician could ever want.

For the moment, our concern is that it can do learning simulations on datasets to try to reach an adult grammar.

Page 47:

Legendre et al.

The system we discussed for French last time (*F, *F2, ParseT, ParseA) is a slightly different system again (based on work by Reynolds, related to work by Anttila).
Under that system, learning proceeds by starting with M >> F and promoting Faithfulness constraints, but not in an all-or-nothing manner—rather, constraints are promoted such that they span a range of the constraint hierarchy. This also yields percentages. It’s a form of multiple grammars, but very related grammars.

Page 48:

Praat

We’re going to try out Praat’s learning capabilities.
Praat comes with some built-in grammars, for example the NoCoda grammar.
Each Praat grammar lists:
The constraints and their position
Any “fixed rankings” (e.g. *F2 >> *F)
The possible inputs
The candidate outputs
The constraints each candidate violates

Page 49:

Seeing what Praat grammars can do

To learn a grammar (that is, given the constraints in the grammar, but set at some random place in the hierarchy initially), the learner needs data.

You can write pair distributions, which say how frequently /form/ comes out as [form] and as [for], for example.

You can use these distributions to create a corpus (input strings) which the learner will process to try to set the ranking.
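Generating such a corpus is just weighted sampling over the pair distribution. A sketch (not Praat's file format; the 70/30 frequencies are made up for illustration):

```python
import random

pair_dist = {                      # (/input/, [output]) -> frequency
    ("/form/", "[form]"): 70,
    ("/form/", "[for]"): 30,
}

def sample_corpus(dist, n, rng=random):
    """Draw n (input, output) learning pairs in proportion to the
    frequencies in the pair distribution."""
    pairs = list(dist)
    return rng.choices(pairs, weights=[dist[p] for p in pairs], k=n)

random.seed(0)
corpus = sample_corpus(pair_dist, 1000)
```

In a corpus of 1000 tokens, about 700 come out as [form], so a frequency-sensitive learner like the GLA can in principle recover the 70/30 split from the data alone.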

Page 50:

Finnish plurals

Anttila’s data.

Page 51: