Publications by Year: 1991

Pinker, S., & Levin, B. (1991). Lexical and Conceptual Semantics. Cambridge, MA: MIT Press.

How are words represented in the mind and woven into sentences? How do children learn how to use words? Currently there is a tremendous resurgence of interest in lexical semantics. Word meanings have become increasingly important in linguistic theories because syntactic constructions are sensitive to the words they contain. In computational linguistics, new techniques are being applied to analyze words in texts, and machine-readable dictionaries are being used to build lexicons for natural language systems. These technologies provide large amounts of data and powerful data-analysis techniques to theoretical linguists, who can repay the favor to computer science by describing how one efficient lexical system, the human mind, represents word meanings. Lexical semantics provides crucial evidence to psychologists, too, about the innate stuff out of which concepts are made. Finally, it has become central to the study of child language acquisition. Infants are not born knowing a language, but they do have some understanding of the conceptual world that their parents describe in their speech. Since concepts are intimately tied to word meanings, knowledge of semantics might help children break into the rest of the language system. Lexical and Conceptual Semantics offers views from a variety of disciplines of these sophisticated new approaches to understanding the mental dictionary.

Gropen, J., Pinker, S., Hollander, M., & Goldberg, R. (1991). Affectedness and Direct Objects: The Role of Lexical Semantics in the Acquisition of Verb Argument Structure. Cognition, 41(1-3), 153-195.

How do speakers predict the syntax of a verb from its meaning? Traditional theories posit that syntactically relevant information about semantic arguments consists of a list of thematic roles like "agent", "theme", and "goal", which are linked onto a hierarchy of grammatical positions like subject, object and oblique object. For verbs involving motion, the entity caused to move is defined as the "theme" or "patient" and linked to the object. However, this fails for many common verbs, as in *fill water into the glass and *cover a sheet onto the bed. In more recent theories, verbs’ meanings are multidimensional structures in which the motions, changes, and other events can be represented in separate but connected substructures; linking rules are sensitive to the position of an argument in a particular configuration. The verb’s object would be linked not to the moving entity but to the argument specified as "affected" or caused to change as the main event in the verb’s meaning. The change can either be one of location, resulting from motion in a particular manner, or of state, resulting from accommodating or reacting to a substance. For example, pour specifies how a substance moves (downward in a stream), so its substance argument is the object (pour the water/*the glass); fill specifies how a container changes (from not full to full), so its stationary container argument is the object (fill the glass/*the water). The newer theory was tested in three experiments. Children aged 3;4-9;4 and adults were taught made-up verbs, presented in a neutral syntactic context (this is mooping), referring to a transfer of items to a surface or container. Subjects were tested on their willingness to encode the moving items or the surface as the verb’s object. For verbs where the items moved in a particular manner (e.g., zig-zagging), people were more likely to express the moving items as the object; for verbs where the surface changed state (e.g., shape, color, or fullness), people were more likely to express the surface as the object. This confirms that speakers are not confined to labeling moving entities as "themes" or "patients" and linking them to the grammatical object; when a stationary entity undergoes a state change as the result of a motion, it can be represented as the main affected argument and thereby linked to the grammatical object instead.
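As a rough illustration of the linking account summarized in this abstract (not code from the paper), the following Python sketch shows the idea that a verb's semantic structure marks which argument is "affected" by its main event, and a linking rule maps that argument onto the grammatical object. The VerbMeaning class, the pour/fill entries, and link_object are hypothetical names invented for the example.

```python
from dataclasses import dataclass

@dataclass
class VerbMeaning:
    name: str
    main_event: str   # "change_of_location" or "change_of_state"
    affected: str     # which argument the main event affects

# Hypothetical lexical entries: pour specifies a manner of motion of the
# substance; fill specifies a change of state of the container.
POUR = VerbMeaning("pour", main_event="change_of_location", affected="substance")
FILL = VerbMeaning("fill", main_event="change_of_state", affected="container")

def link_object(verb: VerbMeaning, substance: str, container: str) -> str:
    """Linking rule: the argument affected by the verb's main event
    becomes the grammatical object."""
    return substance if verb.affected == "substance" else container

print(link_object(POUR, "the water", "the glass"))  # -> the water  (pour the water)
print(link_object(FILL, "the water", "the glass"))  # -> the glass  (fill the glass)
```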

Pinker, S. (1991). Rules of Language. Science, 253, 530-535.

Language and cognition have been explained as the products of a homogeneous associative memory structure or, alternatively, of a set of genetically determined computational modules in which rules manipulate symbolic representations. Intensive study of one phenomenon of English grammar and how it is processed and acquired suggests that both theories are partly right. Regular verbs (walk-walked) are computed by a suffixation rule in a neural system for grammatical processing; irregular verbs (run-ran) are retrieved from an associative memory.
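The hybrid picture described here can be illustrated with a small, hypothetical sketch (not the paper's model): stored irregular past-tense forms stand in for associative memory and block the rule, while the regular -ed suffixation rule applies as the default, including to novel verbs. IRREGULAR_PAST and past_tense are invented names, and the spelling adjustments are simplified.

```python
# Stored forms standing in for associative memory.
IRREGULAR_PAST = {"run": "ran", "sing": "sang", "go": "went", "bring": "brought"}

def past_tense(verb: str) -> str:
    # Memory retrieval: a stored irregular form blocks the rule.
    if verb in IRREGULAR_PAST:
        return IRREGULAR_PAST[verb]
    # Default rule: suffix -ed (spelling details simplified).
    if verb.endswith("e"):
        return verb + "d"
    return verb + "ed"

for v in ["walk", "run", "moop", "bake"]:
    print(v, "->", past_tense(v))
# walk -> walked, run -> ran, moop -> mooped (novel verbs get the rule), bake -> baked
```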

Prasada, S., & Pinker, S. (1991). Generalisation of regular and irregular morphological patterns. Language and Cognitive Processes, 8(1), 1-56.

When it comes to explaining how English speakers generalize regular and irregular verb patterns, single-network theories have difficulty with the former and rule-only theories with the latter. Linguistic and psycholinguistic evidence, drawn from experiments and simulations of morphological pattern generation, independently calls for a hybrid of the two theories.