Major reference: Bloom, P. (1994). Recent controversies
in the study of language acquisition. In M. A. Gernsbacher (Ed.), Handbook
of psycholinguistics (pp. 741-779). San Diego, CA: Academic Press.
Recommended: Pinker, S. (1991). Rules of language. Science, 253, 530-535.
MacWhinney, B. (1998). Models of the emergence of language.
Annual Review of Psychology, 49, 199-227.
Significance of language acquisition:
Theories of mind: Aristotle, Plato, Skinner, Chomsky, etc.
Nativism vs. Learning
Unique human accomplishment?
Perception: High-amplitude sucking (detects discrimination of novel stimuli; e.g., phonological changes)
Conditioned head turn procedure: infant is conditioned to turn when a new sound is presented
Production: Naturalistic samples (R. Brown; Adam, Eve, Sarah)
CHILDES database (http://poppy.psy.cmu.edu/childes/index.html)
Elicited speech (This is a wug; Now there are two _____)
What do they know and when do they know it?
Prenatal: Newborn preference for mother's voice
Newborn preference for previously read story (prosodic familiarity)
Phonetic perception and segmentation: at 2 months,
infants discriminate among sounds in all natural languages (a language-specific capacity,
or a general auditory perception ability?); at 6-12 months discriminative ability
becomes limited to the native language
infants prefer speech with prosodic characteristics of native language
10 months: prefer words with common native stress patterns
(becoming attuned to patterns allowing them to segment phonemes, morphemes)
Babbling: 7-10 months - reduplicative babbling
(sounds vary as function of home language)
Deaf children babble (language ability geared to acquire abstract linguistic structure not just speech)
Lexical production: first words produced at 10-12 months (?), excluding mama and dada (which are easier)
Earliest words (universal): names (specific individuals, objects, substances). Also, action verbs and adjectives. Mental state verbs later.
18 months: word spurt (dramatic increase in rate of acquisition/use)
Simultaneously, word combinations (2 words) appear; the two events (lexical and syntactic) are correlated
telegraphic speech: short utterances with function words/morphemes deleted (pick up; more juice; hug now)
word order sensitivity (Big Bird tickles Cookie Monster vs. vice versa); sensitivity occurs even for children whose native language has free word order
2 ½ years: 2-word stage ends; utterance length
increases, function words appear, more advanced syntax (e.g., relative clauses)
Input to language development
Input required during critical periods (feral children).
But how much and in what form?
Motherese (child-directed speech) - slow, high-pitched
- preferred by infants
- aids social bonding
- aids development of child's parsing ability
- reliable source of positive evidence for child
But negative evidence has no effect; there is no correlation between
the child's grammaticality and parental disapproval
- Western parent-child interaction not norm worldwide
- Development of grammar in absence of target exposure (home sign)
Acquisition of Word Meaning
Theory of word acquisition may entail theory of mind (what it means to know a concept)
Gavagai problem (Quine, 1960); There goes a dog
Which word is relevant?
Motherese, emphasis, parent-child interaction important
Meaning of word? (e.g., dog)
Hypothesis testing procedure: Child forms hypothesis
But, infinite number of hypotheses
(Assume a word refers to the whole object and not its parts)
(Assume labels refer to objects of the same kind, not thematically related objects (e.g., dog - bone): asked "Can you find a dax?" children choose the dog; asked "Can you find another?" they may choose the bone)
Assume objects have only one label (mutual exclusivity): asked to point to the fendle (a novel word) with familiar and unfamiliar objects present, children choose the unfamiliar object
Vs. Principle of contrast (E. Clark) - one label because children possess contrast principle; different words exist because they have different meanings
Problems with constraints:
Applies largely to names; what about verbs?
Bias not absolute constraint
Constraints don't specify how children know what adult is talking about in the first place.
Can supplement with child's innate 'theory of mind' (assumptions regarding adult intentionality; social basis of acquisition)
Brown (1957); classic study; syntactic bootstrapping - use syntax to infer word meaning
Show children (preschoolers) a picture of an unfamiliar action. Then:
"In this picture you can see nissing" (action)
"In this picture you can see some niss" (mass noun)
"In this picture you can see a niss" (count noun)
Ask children to select another picture showing nissing (they choose the action: know verbs), some niss (choose the material: know mass nouns), or a niss (choose the tool: know count nouns)
Acquisition of syntax
Little negative evidence is given to children; so how is syntax acquired?
Proposals differ in how telegraphic speech is characterized
Discontinuous: telegraphic speech - child does
not possess a grammar; a grammar is acquired via revisions and extensions.
Example: children characterize words based on distributional properties (e.g., 'a' goes before 'dog') and then cluster words/phrases into categories that eventually become syntactic categories (nouns, etc.)
Problem: the number of possible combinations is enormous without rules (similar to the behaviorist problem)
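The distributional proposal above can be sketched as a toy procedure (a hypothetical illustration only; the corpus, the function name, and the choice of left-neighbor contexts are invented for this sketch):

```python
from collections import defaultdict

def distributional_categories(utterances):
    """Group words by the set of words that immediately precede them.

    Words sharing left contexts (e.g., everything that follows 'a'
    or 'the') fall into one cluster -- a stand-in for how a child
    might seed a proto-noun category from distribution alone.
    """
    contexts = defaultdict(set)        # word -> set of preceding words
    for utt in utterances:
        words = utt.split()
        for prev, word in zip(words, words[1:]):
            contexts[word].add(prev)

    clusters = defaultdict(set)        # context signature -> words
    for word, ctx in contexts.items():
        clusters[frozenset(ctx)].add(word)
    return list(clusters.values())

corpus = ["a dog barks", "a ball rolls", "the dog runs", "the ball bounces"]
# 'dog' and 'ball' share the contexts {a, the} and cluster together
print(distributional_categories(corpus))
```

The toy version also shows the scaling problem noted above: with no rules, the number of distinct context signatures (and hence candidate clusters) explodes as the corpus grows.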
Continuous: telegraphic speech - child possesses a
grammar; initial syntactic categories are built into the brain; the child must connect
syntactic categories with words.
Example: semantic bootstrapping (Pinker) - child uses semantic properties to acquire syntax. Part of LAD - map concepts onto syntax (e.g., names of objects (semantic feature) are count nouns)
Example: dog = solid object, so 'dog' = count noun;
hearing 'a dog', the child learns that a count noun follows the determiner 'a';
then 'a problem' -> 'problem' = count noun (even though a problem is not an object)
But syntactic and semantic bootstrapping cannot both apply to the same word;
they can, however, work in conjunction:
learn the meaning of a few words -> derive syntactic categories -> use
syntactic categories to acquire new words.
Parameter setting - Chomsky's approach to syntax acquisition
Basic idea: cross-linguistic differences based on limited variation on universal principles
Input triggers (sets parameter) innately encoded grammar (very nativist approach)
Null subject parameter (most heavily investigated)
Languages vary in subject optionality - whether a subject
is required - and in whether an expletive subject is allowed ("It's raining out");
pro-drop languages (subject optional) don't allow expletive subjects
Hyams: original default setting = optional subject allowed
English-learning children hear evidence that the subject is not optional and reset the parameter
Problem: if the original setting is subject-optional (a subject can be produced but need not be), how does the child know to reset the parameter to subject-required (given that subjects can also be produced in pro-drop languages)?
Expletive subjects: if one occurs, it provides positive evidence (pro-drop languages don't allow it)
Alternative proposal (Bloom)
Subset principle: the original setting is for the language that
is a subset (e.g., non-pro-drop is a subset of pro-drop)
Evidence for the superset can then trigger the shift (if the original setting were the superset, what would trigger the shift for English?)
Explanation can be applied to word order:
Original setting - fixed word order (subset); evidence for free order (superset) triggers shift
Verb tense errors; implications for acquisition
Phenomena (English): regular verbs form the past tense by adding 'ed' - a productive rule; past tense forms can be generated for new words; but approximately 180 irregular verbs violate the 'ed' rule
Child time course is U-shaped curve: perfect performance early (both regular and irregular), then performance degrades - overgeneralization of rule (go-goed), then performance returns
One explanation (Pinker):
Child acquires all past tense forms via rote memory (no
rule) and hence generally perfect performance
Rule (ed) is induced and used; overapplication leads to overgeneralization (error)
Irregular verbs are reintroduced into lexicon
Blocking rule (rule blocked if item - irregular verb - already exists in lexicon)
Error- free performance
Why overgeneralize in the first place? (The initially memorized irregular form should block the rule)
Can't handle the child's production of both irregular and overgeneralized forms in the same conversation
Alternative conceptualization (PDP; Rumelhart & McClelland, 1986)
Past tense forms acquired through associative network
extracting statistical regularities from environment.
No rules; system trained on correspondences between phonological patterns of verb stems and past tense forms
Verb stem is represented by a set of input nodes (each corresponding to a sound pattern in the stem)
This sends signals across links to output nodes (sound of past tense form); if sum > threshold then output that form
Learning phase: links/thresholds adjusted based on feedback
Note: Process is same for regular and irregular verbs (no distinction between them):
e.g., stop -> stopped because 'op' input is linked to 'opped' output
cling -> clung because 'ing' input is linked to 'ung' output
Irregular forms are input heavily early on (they're more frequent)
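The network just described can be illustrated with a toy error-driven version (a loose sketch only: Rumelhart & McClelland's actual model used Wickelfeature encodings of whole phonological strings; the stem-ending features, transform labels, and training set here are invented for illustration):

```python
from collections import defaultdict

def train(pairs, epochs=20, lr=1.0):
    """Error-driven learning of stem -> past-tense-transform links.

    Input features are just the stem's final letters (a crude stand-in
    for phonological encoding); output units are candidate transforms.
    Links are strengthened/weakened by feedback, as in the learning
    phase described above. Regular and irregular verbs are handled by
    one and the same mechanism -- no rule, no lexicon.
    """
    weights = defaultdict(float)                 # (feature, transform) -> strength
    transforms = sorted({t for _, t in pairs})   # sorted: deterministic tie-breaks
    for _ in range(epochs):
        for stem, target in pairs:
            feats = [stem[-2:], stem[-1:]]
            guess = max(transforms,
                        key=lambda t: sum(weights[(f, t)] for f in feats))
            if guess != target:                  # feedback adjusts the links
                for f in feats:
                    weights[(f, target)] += lr
                    weights[(f, guess)] -= lr
    return weights, transforms

def predict(weights, transforms, stem):
    feats = [stem[-2:], stem[-1:]]
    return max(transforms, key=lambda t: sum(weights[(f, t)] for f in feats))

pairs = [("stop", "+ed"), ("walk", "+ed"),
         ("cling", "ing->ung"), ("sing", "ing->ung")]
w, t = train(pairs)
print(predict(w, t, "fling"))   # novel '-ing' stem patterns with cling/sing
print(predict(w, t, "jump"))    # falls back to the '+ed' mapping
```

The point of the sketch is that generalization to novel stems ('fling' -> clung-like, 'jump' -> '+ed') falls out of the learned links alone, with no regular/irregular distinction in the mechanism.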
Model parallels U-shaped curve
the model's initial input, weighted toward irregular verbs, may not match the child's input (children hear roughly equal numbers of irregular and regular verbs throughout)
the lexicon is not included in the model (all phonological); but
tense is based on lexical, not phonological, features (e.g., ring/wring sound
the same but their past tenses are rang/wrung). Also, the past tense of verbs derived from
nouns is regular even if a similar-sounding irregular verb exists (e.g.,
flied out, not flew out).
regular past tense is produced by rule (symbolic); irregulars are
stored in associative memory (PDP-style)
irregular form blocks regular rule
overgeneralization occurs because child's memory trace not strong enough to guarantee perfect retrieval, and retrieval failure results in rule application (and hence overgeneralization)
(Note: the reverse dissociation occurs for those with intact memory but grammatical impairment - agrammatic aphasia: performance with irregulars is fine (memory system intact) but regular verb production is problematic)