
Paul Pietroski


Professor Emeritus, Linguistics
Professor Emeritus, Philosophy


 

Research Expertise

Philosophy of Language
Semantics

Paul Pietroski (PhD, MIT) is Professor Emeritus of Philosophy and Linguistics. His main research interests lie at the intersection of these fields. Recently, his work has focused on how grammatical structure is related to logical form, how meaning is related to truth, and how human concepts are related to linguistic understanding.

Publications

Individuals versus ensembles and "each" versus "every": Linguistic framing affects performance in a change detection task

More evidence that "every" but not "each" evokes ensemble representations.

Linguistics

Contributor(s): Jeffrey Lidz, Paul Pietroski
Non-ARHU Contributor(s): Tyler Knowlton *21, Justin Halberda

Though each and every are both distributive universal quantifiers, a common theme in linguistic and psycholinguistic investigations into them has been that each is somehow more individualistic than every. We offer a novel explanation for this generalization: each has a first-order meaning which serves as an internalized instruction to cognition to build a thought that calls for representing the (restricted) domain as a series of individuals; by contrast, every has a second-order meaning which serves as an instruction to build a thought that calls for grouping the domain. In support of this view, we show that these distinct meanings invite the use of distinct verification strategies, using a novel paradigm. In two experiments, participants who had been asked to verify sentences like each/every circle is green were subsequently given a change detection task. Those who evaluated each-sentences were better able to detect the change, suggesting they encoded the individual circles' colors to a greater degree. Taken together with past work demonstrating that participants recall group properties after evaluating sentences with every better than after evaluating sentences with each, these results support the hypothesis that each and every call for treating the individuals that constitute their domain differently: as independent individuals (each) or as members of an ensemble collection (every). We situate our findings within a conception of linguistic meanings as instructions for thought building, on which the format of the resulting thought has consequences for how meanings interface with non-linguistic cognition.
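To make the first-order/second-order contrast concrete (a schematic rendering in standard logical notation, not the notation of the paper itself): a first-order meaning for each treats the restricted domain as a series of individuals, roughly

  ∀x (Circle(x) → Green(x)),

evaluated circle by circle, whereas a second-order meaning for every first groups the domain and then predicates of that group, roughly

  ∃X [∀x (Xx ↔ Circle(x)) ∧ ∀x (Xx → Green(x))],

where X ranges over groups (ensembles) rather than individuals.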

Read More about Individuals versus ensembles and "each" versus "every": Linguistic framing affects performance in a change detection task

Psycholinguistic evidence for restricted quantification

Determiners express restricted quantifiers and not relations between sets.

Linguistics | Philosophy

Contributor(s): Jeffrey Lidz, Alexander Williams, Paul Pietroski
Non-ARHU Contributor(s): Tyler Knowlton *21, Justin Halberda (JHU)

Quantificational determiners are often said to be devices for expressing relations. For example, the meaning of every is standardly described as the inclusion relation, with a sentence like every frog is green meaning roughly that the green things include the frogs. Here, we consider an older, non-relational alternative: determiners are tools for creating restricted quantifiers. On this view, determiners specify how many elements of a restricted domain (e.g., the frogs) satisfy a given condition (e.g., being green). One important difference concerns how the determiner treats its two grammatical arguments. On the relational view, the arguments are on a logical par as independent terms that specify the two relata. But on the restricted view, the arguments play distinct logical roles: specifying the limited domain versus supplying an additional condition on domain entities. We present psycholinguistic evidence suggesting that the restricted view better describes what speakers know when they know the meaning of a determiner. In particular, we find that when asked to evaluate sentences of the form every F is G, participants mentally group the Fs but not the Gs. Moreover, participants forego representing the group defined by the intersection of F and G. This tells against the idea that speakers understand every F is G as implying that the Fs bear relation (e.g., inclusion) to a second group.
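As a rough illustration of the two analyses (in our notation, not the authors'): on the relational view, every frog is green is true just in case

  {x : Frog(x)} ⊆ {x : Green(x)},

with the frogs and the green things as independent relata, while on the restricted view the determiner creates a restricted quantifier,

  [every x : Frog(x)] Green(x),

in which the first argument fixes the domain of quantification and the second merely adds a condition that the domain entities must satisfy.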

Read More about Psycholinguistic evidence for restricted quantification

The mental representation of universal quantifiers

On the psychological representations that give the meanings of "every" and "each".

Linguistics

Contributor(s): Jeffrey Lidz, Paul Pietroski
Non-ARHU Contributor(s): Tyler Knowlton *21, Justin Halberda (Hopkins)

A sentence like every circle is blue might be understood in terms of individuals and their properties (e.g., for each thing that is a circle, it is blue) or in terms of a relation between groups (e.g., the blue things include the circles). Relatedly, theorists can specify the contents of universally quantified sentences in first-order or second-order terms. We offer new evidence that this logical first-order vs. second-order distinction corresponds to a psychologically robust individual vs. group distinction that has behavioral repercussions. Participants were shown displays of dots and asked to evaluate sentences with each, every, or all combined with a predicate (e.g., big dot). We find that participants are better at estimating how many things the predicate applied to after evaluating sentences in which universal quantification is indicated with every or all, as opposed to each. We argue that every and all are understood in second-order terms that encourage group representation, while each is understood in first-order terms that encourage individual representation. Since the sentences that participants evaluate are truth-conditionally equivalent, our results also bear on questions concerning how meanings are related to truth-conditions.
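The truth-conditional equivalence at issue can be stated as follows (a standard observation, again in our notation): for any predicates F and G,

  ∀x (F(x) → G(x))   holds just in case   {x : F(x)} ⊆ {x : G(x)},

so first-order and group-based renderings of each/every/all dot is big are true in exactly the same situations; any behavioral difference must therefore reflect the format of the underlying representation rather than what the sentences are true of.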

Read More about The mental representation of universal quantifiers

Linguistic meanings as cognitive instructions

"More" and "most" do not encode the same sorts of comparison.

Linguistics

Contributor(s): Tyler Knowlton, Paul Pietroski, Jeffrey Lidz
Non-ARHU Contributor(s): Tim Hunter *10 (UCLA), Alexis Wellwood *14 (USC), Darko Odic (University of British Columbia), Justin Halberda (Johns Hopkins University)

Natural languages like English connect pronunciations with meanings. Linguistic pronunciations can be described in ways that relate them to our motor system (e.g., to the movement of our lips and tongue). But how do linguistic meanings relate to our nonlinguistic cognitive systems? As a case study, we defend an explicit proposal about the meaning of most by comparing it to the closely related more: whereas more expresses a comparison between two independent subsets, most expresses a subset–superset comparison. Six experiments with adults and children demonstrate that these subtle differences between their meanings influence how participants organize and interrogate their visual world. In otherwise identical situations, changing the word from most to more affects preferences for picture–sentence matching (experiments 1–2), scene creation (experiments 3–4), memory for visual features (experiment 5), and accuracy on speeded truth judgments (experiment 6). These effects support the idea that the meanings of more and most are mental representations that provide detailed instructions to conceptual systems.
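One way to spell out the proposed contrast (a hedged sketch in set-theoretic terms; the paper itself frames the meanings as instructions to cognition): a sentence like more of the dots are blue than yellow compares two independent subsets,

  |Dot ∩ Blue| > |Dot ∩ Yellow|,

whereas most of the dots are blue compares a subset with its superset, for instance

  |Dot ∩ Blue| > |Dot| − |Dot ∩ Blue|,

so verifying most invites representing the dots as a whole and the blue dots as a part of that whole.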

Read More about Linguistic meanings as cognitive instructions

Interrogatives, Instructions, and I-languages: An I-Semantics for Questions

An internalist semantics for interrogative clauses, from Terje Lohndal and Paul Pietroski.

Linguistics

Contributor(s): Paul Pietroski
Non-ARHU Contributor(s): Terje Lohndal
It is often said that the meaning of an interrogative sentence is a set of answers. This raises questions about how the meaning of an interrogative is compositionally determined, especially if one adopts an I-language perspective. By contrast, we argue that I-languages generate semantic instructions (SEMs) for how to assemble concepts of a special sort and then prepare these concepts for various uses - e.g., in declaring, querying, or assembling concepts of still further complexity. We connect this abstract conception of meaning to a specific (minimalist) conception of complementizer phrase edges, with special attention to wh-questions and their relative clause counterparts. The proposed syntax and semantics illustrate a more general conception of edges and their relation to the so-called duality of semantics.

Poverty of the Stimulus Revisited

Countering recent critiques, Paul Pietroski and collaborators defend the idea that some invariances in human languages reflect an innate human endowment, as opposed to common experience.

Linguistics

Contributor(s): Paul Pietroski
Non-ARHU Contributor(s): Robert Berwick, Beracah Yankama, Noam Chomsky
A central goal of modern generative grammar has been to discover invariant properties of human languages that reflect 'the innate schematism of mind that is applied to the data of experience' and that 'might reasonably be attributed to the organism itself as its contribution to the task of the acquisition of knowledge'. Candidates for such invariances include the structure dependence of grammatical rules, and in particular, certain constraints on question formation. Various 'poverty of stimulus' (POS) arguments suggest that these invariances reflect an innate human endowment, as opposed to common experience: Such experience warrants selection of the grammars acquired only if humans assume, a priori, that selectable grammars respect substantive constraints. Recently, several researchers have tried to rebut these POS arguments. In response, we illustrate why POS arguments remain an important source of support for appeal to a priori structure-dependent constraints on the grammars that humans naturally acquire.

Meaning before truth

Linguistic semantics should be the study, not of reference and truth conditions, but of how the expressions of a natural language constrain the contents of thoughts and communicative actions.

Linguistics

Contributor(s): Paul Pietroski