Comment on “Nonadjacent dependency processing in monkeys, apes, and humans”
Auditory pattern recognition in nonhuman animals shares important characteristics with human phonology, but not human syntax.
We comment on the technical interpretation of the study by Watson et al. and caution against their conclusion that the behavioral evidence in their experiments points to nonhuman animals’ ability to learn syntactic dependencies: their results are equally consistent with the learning of phonological dependencies of the kind found in human languages.
Social inference may guide early lexical learning
Assessment of knowledgeability and group membership influences infant word learning.
We incorporate social reasoning about groups of informants into a model of word learning, and show that the model accounts for infant looking behavior in both word-learning and word-recognition tasks. Simulation 1 models an experiment in which 16-month-old infants saw familiar objects labeled either correctly or incorrectly by either adults or audio talkers. Simulation 2 reinterprets puzzling data from the Switch task, an audiovisual habituation procedure in which infants are tested on familiarized associations between novel objects and labels. Eight-month-olds outperform 14-month-olds on the Switch task when required to distinguish labels that form a minimal pair (e.g., “buk” and “puk”), but 14-month-olds' performance improves when the habituation stimuli feature multiple talkers. Our modeling results support the hypothesis that beliefs about knowledgeability and group membership guide infant looking behavior in both tasks. These results show that social and linguistic development interact in non-trivial ways, and that social categorization findings in developmental psychology could have substantial implications for understanding linguistic development in realistic settings where talkers vary according to observable features correlated with social groupings, including linguistic, ethnic, and gendered groups.
Computational phonology today
Bill Idsardi and Jeff Heinz highlight important aspects of today's computational phonology.
Phonemes: Lexical access and beyond
A defense of the central role of phonemes in phonology, contrary to the current mainstream.
Categorical effects in fricative perception are reflected in cortical source information
Phonetic discrimination is affected by phonological category more for consonants than for vowels. But what about fricatives in particular? Sol Lago and collaborators provide evidence from MEG event-related fields (ERFs).
What Complexity Differences Reveal About Domains in Language
Do humans learn phonology differently than they do syntax? Yes, argue Bill Idsardi and Jeff Heinz, as this is the best explanation for why phonological but not syntactic patterns all belong to the regular region of the Chomsky Hierarchy.
A single stage approach to learning phonological categories: Insights from Inuktitut
Much research presumes that we acquire phonetic categories before abstracting phonological categories. Ewan Dunbar argues that this two-step progression is unnecessary, with a Bayesian model for the acquisition of Inuktitut vowels.
Sentence and Word Complexity
Do we learn different kinds of linguistic structure differently?
A Comprehensive Three-dimensional Cortical Map of Vowel Space
Postdoc Mathias Scharinger and collaborators use the magnetic N1 (M100) response to map the entire vowel space of Turkish onto cortical locations in the brain. They find two distinct tonotopic maps, one for front vowels and one for back vowels.
You had me at "Hello": Rapid extraction of dialect information from spoken words
MEG studies show that we detect acoustic features of dialect speaker-independently, pre-attentively, and categorically, within 100 milliseconds.