
Research

Research at our top-ranked department spans syntax, semantics, phonology, language acquisition, computational linguistics, psycholinguistics, and neurolinguistics.

Connections between our core competencies are strong, with theoretical, experimental and computational work typically pursued in tandem.

A network of collaboration at all levels sustains a research climate that is both vigorous and friendly. Here new ideas develop in conversation, stimulated by the steady activity of our labs and research groups, frequent student meetings with faculty, regular talks by local and invited scholars, and collaborations with the broader University of Maryland language science community, the largest and most integrated language science research community in North America.

Learning, memory and syntactic bootstrapping: A meditation

Do children learning words rely on memories for where they have heard the word before? Jeff Lidz argues that memory for syntactic context plays a larger role than memory for referential context.

Contributor(s): Jeffrey Lidz
Lila Gleitman’s body of work on word learning raises an apparent paradox. Whereas work on syntactic bootstrapping depends on learners retaining information about the set of distributional contexts that a word occurs in, work on identifying a word’s referent suggests that learners do not retain information about the set of extralinguistic contexts that a word occurs in. I argue that this asymmetry derives from the architecture of the language faculty. Learners expect words with similar meanings to have similar distributions, and so learning depends on a memory for syntactic environments. The referential context in which a word is used is less constrained and hence contributes less to the memories that drive word learning.

Antecedent access mechanisms in pronoun processing: Evidence from the N400

Lexical decisions to a word presented after a pronoun are facilitated when that word is semantically related to the pronoun’s antecedent. These priming effects may depend not on automatic spreading activation, but on the extent to which the relevant word is predicted.

Contributor(s): Ellen Lau
Non-ARHU Contributor(s): Sol Lago (*14), Anna Namyst, Lena Jäger

Previous cross-modal priming studies showed that lexical decisions to words after a pronoun were facilitated when these words were semantically related to the pronoun’s antecedent. These studies suggested that semantic priming effectively measured antecedent retrieval during coreference. We examined whether these effects extended to implicit reading comprehension using the N400 response. The results of three experiments did not yield strong evidence of semantic facilitation due to coreference. Further, comparison with two additional experiments showed that N400 facilitation effects were reduced in sentences (vs. word pair paradigms) and were modulated by the case morphology of the prime word. We propose that priming effects in cross-modal experiments may have resulted from task-related strategies. More generally, the impact of sentence context and morphological information on priming effects suggests that they may depend on the extent to which the upcoming input is predicted, rather than on automatic spreading activation between semantically related words.

Same words, different structures: An fMRI investigation of argument relations and the angular gyrus

fMRI research has implicated the angular gyrus of the left hemisphere in the computation of event concepts. Might its role instead be the computation of argument structure, a specifically linguistic relation?

Non-ARHU Contributor(s): William Matchin
In fMRI, increased activation for combinatorial syntactic and semantic processing is typically observed in a set of left hemisphere brain areas: the angular gyrus (AG), the anterior temporal lobe (ATL), the posterior superior temporal sulcus (pSTS), and the inferior frontal gyrus (IFG). Recent work has suggested that semantic combination is supported by the ATL and the AG, with a division of labor in which the AG is involved in event concepts and the ATL is involved in encoding conceptual features of entities and/or more general forms of semantic combination. The current fMRI study was designed to refine hypotheses about the angular gyrus processes in question. In particular, we ask whether the AG supports the computation of argument structure (a linguistic notion that depends on a verb taking other phrases as arguments) or the computation of event concepts more broadly. To distinguish these possibilities, we used a novel, lexically matched contrast: noun phrases (NPs) (the frightened boy) and verb phrases (VPs) (frightened the boy), where VPs contained argument structure, denoting an event and assigning a thematic role to their argument, and NPs did not, denoting only a semantically enriched entity. Results showed that while many regions showed increased activity for NPs and VPs relative to unstructured word lists (AG, ATL, pSTS, anterior IFG), replicating evidence of their involvement in combinatorial processing, neither the AG nor the ATL showed differences in activation between the VP and NP conditions. These results suggest that increased AG activity does not reflect the computation of argument structure per se; rather, they are compatible with a view in which the AG represents event information denoted by words such as frightened, independent of their grammatical context. By contrast, the pSTS and posterior IFG did show increased activation for VPs relative to NPs. We suggest that these effects may reflect differences in syntactic processing and working memory engaged by different structural relations.

Prosody and Function Words Cue the Acquisition of Word Meanings in 18-Month-Old Infants

18-month-old infants use prosody and function words to recover the syntactic structure of a sentence, which in turn constrains the possible meanings of novel words the sentence contains.

Contributor(s): Jeffrey Lidz
Non-ARHU Contributor(s): Angela Xiaoxue He (*15), Alex de Carvalho, Anne Christophe

Language acquisition presents a formidable task for infants, for whom word learning is a crucial yet challenging step. Syntax (the rules for combining words into sentences) has been robustly shown to be a cue to word meaning. But how can infants access syntactic information when they are still acquiring the meanings of words? We investigated the contribution of two cues that may help infants break into the syntax and give a boost to their lexical acquisition: phrasal prosody (speech melody) and function words, both of which are accessible early in life and correlate with syntactic structure in the world’s languages. We show that 18-month-old infants use prosody and function words to recover sentences’ syntactic structure, which in turn constrains the possible meanings of novel words: Participants (N = 48 in each of two experiments) interpreted a novel word as referring to either an object or an action, given its position within the prosodic-syntactic structure of sentences.

What the PCC tells us about “abstract” agreement, head movement, and locality

Agreement in Person, Number, or Noun Class features is always overtly realized in some part of the paradigm, and is never fully "abstract".

Contributor(s): Omer Preminger

Based on the cross- and intra-linguistic distribution of Person Case Constraint (PCC) effects, this paper shows that there can be no agreement in ϕ-features (PERSON, NUMBER, GENDER/NOUN-CLASS) which systematically lacks a morpho-phonological footprint. That is, there is no such thing as “abstract” ϕ-agreement, null across the entire paradigm. Applying the same diagnostic to instances of clitic doubling, we see that these do involve syntactic agreement. This cannot be because clitic doubling is agreement; it behaves like movement (and unlike agreement) in a variety of respects. Nor can this be because clitic doubling, qua movement, is contingent on prior agreement—since the claim that all movement depends on prior agreement is demonstrably false. Clitic doubling requires prior agreement because it is an instance of non-local head movement, and movement of X0 to Y0 always requires a prior syntactic relationship between Y0 and XP. In local head movement (the kind that is already permitted under the Head Movement Constraint), this requirement is trivially satisfied by (c-)selection. But in non-local cases, agreement must fill this role.

Ellipsis in Transformational Grammar

Ellipsis is deletion.

Contributor(s): Howard Lasnik
Non-ARHU Contributor(s): Kenshi Funakoshi (*14)
Publisher: Oxford University Press

This chapter examines three themes concerning ellipsis that have been extensively discussed in transformational generative grammar: structure, recoverability, and licensing. It reviews arguments in favor of the analysis according to which the ellipsis site is syntactically fully represented, and compares the two variants of this analysis (the deletion analysis and the LF-copying analysis), concluding that the deletion analysis is superior. A discussion of recoverability follows, which concludes that in order for elided material to be recoverable, a semantic identity condition must be satisfied, but that this is not sufficient: syntactic or formal identity must also be taken into account. The chapter finally considers licensing, reviewing proposals in the literature about which properties of licensing heads, and which local relations between the ellipsis site and the licensing head, are relevant to ellipsis licensing.

Control complements in Mandarin Chinese: Implications for restructuring and the Chinese finiteness debate

Mandarin data suggest that restructuring with a control verb is possible even when the verb's complement is a full-sized clause.

Non-ARHU Contributor(s): Nick Huang (*19)

Many proposals on restructuring suggest that restructuring phenomena are only observed when a control predicate takes as a complement a functional projection smaller than a clause. In this paper, I present novel Mandarin data against recent proposals that restructuring control predicates cannot take clausal complements and the related generalization that clausal complements always block restructuring phenomena. An alternative account of the Mandarin data is presented. The data also bear on the question of whether a finiteness distinction exists in Chinese. In particular, they provide clearer evidence that control predicates can take clausal complements that differ syntactically from those of non-control attitude predicates. This difference parallels the cross-linguistic correlation between control predicates and non-finite clausal complements and lends new support for the claim that Chinese makes a finiteness distinction.

The importance of input representations

Learning from data is not incompatible with approaches that attribute rich initial linguistic knowledge to the learner. On the contrary, such approaches must still account for how knowledge guides learners in using their data to infer a grammar.

Contributor(s): Jeffrey Lidz
Non-ARHU Contributor(s): Laurel Perkins (*19)

Language learners use the data in their environment in order to infer the grammatical system that produced that data. Yang (2018) makes the important point that this process requires integrating learners’ experiences with their current linguistic knowledge. A complete theory of language acquisition must explain how learners leverage their developing knowledge in order to draw further inferences on the basis of new data. As Yang and others have argued, the fact that input plays a role in learning is orthogonal to the question of whether language acquisition is primarily knowledge-driven or data-driven (J. A. Fodor, 1966; Lidz & Gagliardi, 2015; Lightfoot, 1991; Wexler & Culicover, 1980). Learning from data is not incompatible with approaches that attribute rich initial linguistic knowledge to the learner. On the contrary, such approaches must still account for how knowledge guides learners in using their data to infer a grammar.

The temporal dynamics of structure and content in sentence comprehension: Evidence from fMRI-constrained MEG

fMRI implicates the TPJ, PTL, ATL and IFG regions of the left hemisphere in the processing of linguistic structure. But what are the temporal dynamics of their involvement? This MEG study provides some initial answers.

Contributor(s): Ellen Lau
Non-ARHU Contributor(s): William Matchin, Chris Hammerly, Christian Brodbeck
Humans have a striking capacity to combine words into sentences that express new meanings. Previous research has identified key brain regions involved in this capacity, but little is known about the time course of activity in these regions, as hemodynamic methods such as fMRI offer limited insight into the temporal dynamics of neural activation. We performed an MEG experiment to elucidate the temporal dynamics of structure and content processing within four brain regions implicated by fMRI data from the same experiment: the temporo-parietal junction (TPJ), the posterior temporal lobe (PTL), the anterior temporal lobe (ATL), and the anterior inferior frontal gyrus (IFG). The TPJ showed increased activity for both structure and content near the end of the sentence, consistent with a role in incremental interpretation of event semantics. The PTL, a region not often associated with core aspects of syntax, showed a strong early effect of structure, consistent with predictive parsing models, and both structural and semantic context effects on function words. These results provide converging evidence that the PTL plays an important role in lexicalized syntactic processing. The ATL and IFG, regions traditionally associated with syntax, showed minimal effects of sentence structure. The ATL, PTL, and IFG all showed effects of semantic content: increased activation for real words relative to nonwords. Our fMRI-guided MEG investigation therefore helps identify syntactic and semantic aspects of sentence comprehension in the brain in both spatial and temporal dimensions.

The labeling problem in syntactic bootstrapping: Main clause syntax in the acquisition of propositional attitude verbs

Children may use properties of a complement clause to make inferences about the meaning of the verb. But how abstract might these cues be, given how languages differ?

Non-ARHU Contributor(s): Aaron Steven White (*15)
Publisher: John Benjamins

In English, the distinction between belief verbs, such as think, and desire verbs, such as want, is tracked by the tense found in the subordinate clauses of those verbs. This suggests that subordinate clause tense might be a useful cue for learning the meanings of these verbs via syntactic bootstrapping. However, the correlation between tense and the belief vs. desire distinction is not cross-linguistically robust, yet the acquisition profile of these verbs is similar across languages. Our proposal in this chapter is that, instead of using concrete cues like subordinate clause tense, learners may utilize more abstract syntactic cues that must be tuned to the syntactic distinctions present in a particular language. We present computational modeling evidence supporting the viability of this proposal.