Howard Lasnik
Professor Emeritus, Linguistics
Member, Maryland Language Science Center
Research Expertise
Syntax
Publications
Structure: Concepts, Consequences, Interactions
Natural phenomena, including human language, are not just series of events but are organized quasi-periodically; sentences have structure, and that structure matters.
Howard Lasnik and Juan Uriagereka “were there” when generative grammar was being developed into the Minimalist Program. In this presentation of the universal aspects of human language as a cognitive phenomenon, they rationally reconstruct syntactic structure. In the process, they touch upon structure dependency and its consequences for learnability, nuanced arguments (including global ones) for structure presupposed in standard linguistic analyses, and a formalism to capture long-range correlations. For practitioners, the authors assess whether “all we need is Merge,” while for outsiders, they summarize what needs to be covered when attempting to have structure “emerge.”
Reconstructing the essential history of what is at stake when arguing for sentence scaffolding, the authors cover a range of larger issues, from the traditional computational notion of structure (the strong generative capacity of a system) and how far down into words it reaches, to whether its variants, as evident across the world's languages, can arise from non-generative systems. While their perspective stems from Noam Chomsky's work, it does so critically, separating rhetoric from results. They consider what they do to be empirical, with the formalism being only a tool to guide their research (of course, they want sharp tools that can be falsified and have predictive power). Reaching out to skeptics, they invite potential collaborations that could arise from mutual examination of one another's work, as they attempt to establish a dialogue beyond generative grammar.
Ellipsis in Transformational Grammar
Ellipsis is deletion.
This chapter examines three themes concerning ellipsis that have been extensively discussed in transformational generative grammar: structure, recoverability, and licensing. It reviews arguments in favor of the analysis according to which the ellipsis site is syntactically fully represented, compares the two variants of that analysis (the deletion analysis and the LF-copying analysis), and concludes that the deletion analysis is superior. A discussion of recoverability follows, concluding that for elided material to be recoverable a semantic identity condition must be satisfied, but that this condition is not sufficient: syntactic or formal identity must also be taken into account. Finally, the chapter considers licensing, reviewing proposals in the literature about which properties of licensing heads, and which local relation between the ellipsis site and the licensing head, are relevant to ellipsis licensing.