Chomsky's "Three Models of language"
1) Markov models
2) phrase structure grammars
3) transformations

-discusses the mathematics of syntax: unlike Markov models, which predict letters and words from the letters and words that precede them (sketched below), his models are context-free and depend on syntax, the structural parts of the sentence

-language acquisition: we have innate constraints, a universal grammar shared across every language
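A minimal sketch of the kind of Markov model Chomsky argued against: it predicts each word only from the word immediately before it (the toy corpus and names here are my own, purely illustrative).

    import random
    from collections import defaultdict

    # Toy corpus; a real model would be trained on much more text.
    corpus = "the dog chased the cat . the cat saw the dog .".split()

    # Count bigram transitions: each word maps to its observed successors.
    nexts = defaultdict(list)
    for prev, cur in zip(corpus, corpus[1:]):
        nexts[prev].append(cur)

    # Generate by repeatedly sampling a successor of the current word.
    word, out = "the", ["the"]
    for _ in range(8):
        word = random.choice(nexts[word])
        out.append(word)
    print(" ".join(out))

Such a model captures local co-occurrence statistics but has no notion of hierarchical structure, which is exactly Chomsky's objection.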
linguistic competence vs. performance
the knowledge one possesses of a language vs. how one actually uses it
generative grammar
a set of rules that generates the grammatically correct sentences of a language.
for English: S --> NP VP, VP --> V AP, AP --> A
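A minimal sketch of how rewrite rules like these generate sentences; the rules for S, VP, and AP are from the card, while the NP rule and lexical entries are toy additions of mine.

    import random

    rules = {
        "S":  [["NP", "VP"]],        # rules from the card
        "VP": [["V", "AP"]],
        "AP": [["A"]],
        "NP": [["the", "N"]],        # assumed lexical scaffolding (mine)
        "N":  [["dog"], ["idea"]],
        "V":  [["seems"], ["looks"]],
        "A":  [["happy"], ["green"]],
    }

    def expand(symbol):
        # Terminals (no rule) are emitted as-is; nonterminals are rewritten.
        if symbol not in rules:
            return [symbol]
        choice = random.choice(rules[symbol])
        return [word for part in choice for word in expand(part)]

    print(" ".join(expand("S")))  # e.g. "the idea looks green"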
universal grammar
rules shared across every language spoken by humans; a theory of the genetic component of the language faculty
Chomsky's idea of language
a set of grammatically correct sentences that are of finite length and are constructed out of a finite set of elements
syntax's role in cogsci
to model a system of knowledge that determines which utterances constitute a certain language, and to understand where this knowledge comes from
phrase structure grammar
a grammar able to capture arbitrarily long dependencies using recursion and a fixed number of constituents
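A sketch of the recursion point: because one rule can mention its own category, a fixed, finite grammar yields unboundedly deep nesting, with each 'if' matched by a 'then' arbitrarily far away (the rule and the words are my toy choices).

    # Toy recursive rule: S -> "if" S "then" S | "it rains"
    def sentence(depth):
        if depth == 0:
            return "it rains"
        # Each level opens an "if ... then" dependency that must be
        # closed, no matter how much material intervenes.
        return f"if {sentence(depth - 1)} then {sentence(0)}"

    for d in range(3):
        print(sentence(d))
    # it rains
    # if it rains then it rains
    # if if it rains then it rains then it rains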
transformations
moving constituents around in a sentence to change its function or meaning.

-Wh-movement: moves a wh-word like 'what' or 'where' to the front of the sentence to question a constituent; this takes us from a yes/no question to a wh-question (e.g. 'did you see what?' becomes 'what did you see?')
Connectionist approach to language
rather than language being governed by a set of rules, words are points in a distributed representational space and grammatical sentences are paths through that space
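A toy illustration of the 'points in space, paths through space' picture; the vectors are arbitrary numbers of mine, standing in for what a trained network would actually learn.

    import numpy as np

    # Arbitrary toy embeddings; a real network would learn these.
    emb = {
        "the":  np.array([0.1, 0.9]),
        "dog":  np.array([0.8, 0.3]),
        "runs": np.array([0.4, 0.7]),
    }

    # A sentence is a path: the sequence of points its words occupy.
    path = np.stack([emb[w] for w in "the dog runs".split()])
    print(path)                   # shape (3, 2): three words, one point each
    print(np.diff(path, axis=0))  # the steps the sentence takes through the space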
stages of language acquisition
6 mos.: babbling (identifiable phonemes that eventually converge on the sounds of the native language)
one yr.: single-word utterances; over- and underextension are common
two yrs.: two-word utterances built on a very simple grammar
over/underextension
when a child uses a word with a broader/narrower meaning than it has in adult language.
e.g. 'dog' for any four-legged animal / 'dog' is only their dog, not any dog in the park
overregularization and U-shaped learning
applying a regular rule to an exception to the rule.
e.g. children first say 'went' as the past tense of 'go', but at a certain point, after learning the -ed rule, they may say 'goed' before eventually returning to 'went'; the dip in the middle is the U shape
three factors of language acquisition
1. genetic components of language faculty
2. genetic components of general learning mechanisms
3. linguistic experience
poverty of the stimulus
the rules of a language are far too complex, and children could never have been exposed to enough examples to learn them all; also, there is an infinite number of possible sentences. therefore, there must be a genetic mechanism for language.
"logical problem" of language acquisition
-children receive mostly positive evidence (grammatically correct sentences), rather than examples of sentences explicitly labeled 'grammatical' or 'ungrammatical'.
-if the target language is a subset of the child's hypothesis, then no positive evidence can rule out the mistakes (see the sketch below)
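A toy demonstration of the subset point, with languages chosen by me: if the child's hypothesis generates a superset of the target, every grammatical sentence they hear is consistent with the wrong hypothesis, so positive evidence alone can never force a correction.

    # Target language (toy): n 'a's followed by exactly n 'b's.
    def in_target(s):
        n = len(s) // 2
        return s == "a" * n + "b" * n

    # Overly broad hypothesis: any run of 'a's followed by any run of 'b's.
    def in_hypothesis(s):
        i = len(s) - len(s.lstrip("a"))
        return set(s[:i]) <= {"a"} and set(s[i:]) <= {"b"}

    # Positive evidence: the child only ever hears grammatical strings.
    positives = ["ab", "aabb", "aaabbb"]
    print(all(in_hypothesis(s) for s in positives))  # True: nothing to correct

    # The hypothesis still overgenerates, but no positive example reveals it.
    print(in_hypothesis("aab"), in_target("aab"))    # True False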
poverty of the stimulus rebuttal
assumption: the argument relies on the fact that there are too many rules to learn.
- are there actually that many rules?
- are there other, more powerful ways to learn a language?
empiricism in language acquisition (arguments against a universal grammar)
- neural networks (connectionism)
- probability helps explain aspects of language learning (see the sketch below)
- parts of language that Chomsky believes innate are actually emphasized more to children and are more common (e.g. Motherese exaggerates linguistic features --> facilitates learning)
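A sketch of how simple statistics can do real work in acquisition, in the spirit of infant word-segmentation experiments: transitional probabilities between syllables are high inside words and drop at word boundaries (the made-up words and stream are mine).

    import random
    from collections import Counter

    random.seed(0)

    # Toy syllable stream built from two made-up words, "bida" and "kupa".
    stream = []
    for _ in range(200):
        stream += random.choice(["bi-da", "ku-pa"]).split("-")

    pair_counts = Counter(zip(stream, stream[1:]))
    first_counts = Counter(stream[:-1])

    def tp(a, b):
        # Transitional probability P(b | a).
        return pair_counts[(a, b)] / first_counts[a]

    print(tp("bi", "da"))  # within a word: 1.0
    print(tp("da", "ku"))  # across a word boundary: roughly 0.5

A learner that posits word boundaries wherever transitional probability dips recovers the words without any labeled evidence.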
linguistics' role in cogsci
language represents human thought in a comprehensible way, so we are given insight into thought processes
generative grammars and ambiguity
when we draw a generative grammar tree, we can see where the ambiguity comes from and what syntax can be manipulated without changing meaning
e.g. 'the people talked over the noise' (did they discuss the noise, or talk above it? see the parses below)
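A sketch of the two parses using NLTK; the grammar rules are my own toy choices, written so the example sentence comes out ambiguous.

    import nltk

    grammar = nltk.CFG.fromstring("""
    S   -> NP VP
    NP  -> Det N
    VP  -> V PP | V P NP
    PP  -> P NP
    Det -> 'the'
    N   -> 'people' | 'noise'
    V   -> 'talked'
    P   -> 'over'
    """)

    parser = nltk.ChartParser(grammar)
    for tree in parser.parse("the people talked over the noise".split()):
        tree.pretty_print()
    # One tree groups "over the noise" as a PP (they talked above the noise);
    # the other attaches "over" to the verb (they talked the noise over, i.e. discussed it).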
attractive properties of grammars
- infinite ends from finite means: we're able to make an infinite number of sentences from a few rules
- looking at trees helps us find the meanings of sentences; this can sometimes get us through ambiguity