
Semantics – 4th edition

Chapter 1 – Semantics in Linguistics

1.1 Introduction
Semantics is the study of meaning in language. The basic assumption with which we begin is that a
person’s linguistic abilities are based on knowledge that they have. According to modern linguistics,
this knowledge is divided into several types, which match the levels of linguistic analysis:
phonological, (morpho-)syntactic and semantic.
Semantic knowledge allows speakers to know which sentences contradict each other, which describe
the same situation, which are ambiguous, and which entail one another.

Entailment: a relation between sentences such that if a sentence A entails a sentence B, then knowing A
we automatically know B. It is therefore impossible to assert A and deny B.
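
As a worked illustration (the example sentences are my own, not the chapter’s): $A \models B$ (A entails B)
holds just in case whenever A is true, B must also be true, so that $A \wedge \neg B$ is a contradiction. For
instance, if A = “The gardener killed the greenfly” and B = “The greenfly died”, then asserting A while
denying B (“The gardener killed the greenfly, but the greenfly did not die”) is contradictory, so A entails B.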

Because of the broadness of its subject matter, semantics is the most diverse field within linguistics.
Semanticists also need to be acquainted with other disciplines, such as psychology and philosophy, since
these too investigate the creation and transmission of meaning.

1.2 Semantics and Semiotics


Semantics studies how people communicate meaning through language, which belongs to a larger
enterprise: understanding how people communicate meaning in general, the study of signs known as
semiotics. Linguistic meaning is thus a subset of the human ability to use signs. The word mean itself has
several uses; relationships between a signifier and what it signifies are traditionally classified as
indexical, iconic or symbolic.

1.3 Three Challenges in Doing Semantics


Analyzing a speaker’s semantic knowledge is always challenging, as we can see by adopting the
definitions theory.
Definitions theory: the meaning of words is established by giving definitions, and these word meanings
in turn build up the meanings of larger linguistic expressions.

Giving definitions to words leads to certain problems, three of which are particularly noteworthy:
• Circularity: we can only state the meaning of a word through other words. If the definitions
of word meaning are given in words, can we ever step outside language so as to describe it,
or will we always make use of circular definitions?
• Distinction between linguistic knowledge and encyclopedic knowledge: word meaning is
a kind of knowledge, and it seems to intersect with our knowledge of how the world is.
People can still understand each other even when their encyclopedic knowledge differs
(slightly) from one another’s. Furthermore, if the understanding of a word’s meaning
differs among speakers, how do we decide whose knowledge counts as the meaning?
• Contribution of context to meaning: if features of the context of an utterance are part of its
meaning, there is no obvious way to include them in word definitions.
These three difficulties show that the definitions theory is too simple to describe the phenomena
we need to investigate. Various theories have sought to solve these problems.

1.4 Meeting the Challenges


Some possible strategies to deal with the problems outlined above include designing a semantic
metalanguage to describe the semantic units and rules of all languages, thus solving the problem of
circularity. An ideal metalanguage is neutral with respect to all natural languages and should fulfill the
criteria of clarity, economy, consistency, etc. Several such metalanguages have been proposed, though
some have claimed that an adequate metalanguage is unattainable.
However, some linguists argue that even a perfect metalanguage would not amount to a complete semantic
description, because the semantics of words must ultimately be grounded in something non-linguistic.
This raises the question of whether words signify real objects or mere thoughts (chapter 2).
Setting up a metalanguage can also help with the problem of linguistic versus encyclopedic knowledge,
since it raises the question of which elements of knowledge should be included in the
definition of a word. The issue here is how much knowledge a speaker needs in order to
use a word.
The problem of context is traditionally addressed by distinguishing between the conventional or literal
meaning of an expression and local contextual effects; however, isolating the meaning of a word
entirely from any context is not easy. The study of context in communication falls within the field of
pragmatics, which investigates how speakers and hearers combine knowledge of context with
linguistic knowledge.

1.5 Semantics in a Model of Grammar


1.5.1 Introduction
Semantics is usually conceived as a component of grammar parallel to the others. But what kind
of module is it? The answer varies depending on the theory. All linguistic levels serve to
communicate meaning: phonemes, verb endings and word order all convey differences of meaning.
This leads some writers to conclude that meaning cannot be a separate level, but must be directly
bound up with the other levels of grammar.
The theory known as Cognitive Grammar asserts that this is the case, and that there should be no
strict separation of syntax, morphology and lexicon. It goes on to say that it is not possible to
separate linguistic knowledge from extra-linguistic knowledge. Nevertheless, many other
linguists maintain the distinction between linguistic and non-linguistic
knowledge and between the modules of linguistic study.

1.5.2 Word meaning and sentence meaning


Some linguists use the term lexicon for the mental store of the words of a language known by a
speaker, comparing it to a dictionary. The lexicon is a large (and dynamic) body of knowledge,
part of which is semantic.
An important difference between word meaning and sentence and phrase meaning is productivity:
it is much more common for new phrases (which the speaker has never produced or heard before) to be
put together than for new words to be created. A central insight of generative grammar is that a
small number of combinatory rules and a finite set of words can create a huge, potentially infinite,
number of sentences. Hence the rules of sentence formation must be recursive, allowing repeated
embedding and coordination of syntactic categories: one can always add another clause to a
sentence or another nominal within a nominal, as in the sketch below.
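
To make the idea of recursion concrete, here is a minimal sketch in Python (my own illustration, not from
the chapter): a nominal may contain another nominal inside a relative clause, so there is no longest noun phrase.

# Recursive rule: Nominal -> "the" Noun ("that" Verb Nominal)
def nominal(depth):
    """Build a noun phrase with `depth` levels of embedded relative clauses."""
    nouns = ["dog", "cat", "rat"]
    verbs = ["chased", "saw", "feared"]
    np = "the " + nouns[depth % len(nouns)]
    if depth == 0:
        return np
    # Recursive step: embed another nominal inside a relative clause.
    return np + " that " + verbs[depth % len(verbs)] + " " + nominal(depth - 1)

print(nominal(0))  # the dog
print(nominal(1))  # the cat that saw the dog
print(nominal(2))  # the rat that feared the cat that saw the dog

Because the rule can apply to its own output, the set of possible nominals (and hence of the sentences
containing them) has no upper bound.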
Since speakers produce and hearers understand such novel sentences, sentence meanings cannot be listed
in a lexicon like those of words: they too must be created by rules of combination. This is often described
by saying that sentence meaning is compositional: the meaning of an expression is determined by the
meanings of its parts and by the way in which they are combined.
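
A toy sketch of compositionality (my own illustration, not the book’s formalism): word meanings are
stored in a small lexicon, and a single rule of combination computes the meaning of a phrase that is not
itself listed anywhere.

# Lexicon: stored word meanings (here, nouns denote sets of entities
# and adjectives denote functions that restrict a set).
lexicon = {
    "ball": {"ball1", "ball2"},
    "car": {"car1"},
    "red": lambda s: s & {"ball1", "car1"},
}

def adj_noun(adj, noun):
    """Rule of combination: the meaning of [Adj N] is the adjective's
    function applied to the noun's set."""
    return lexicon[adj](lexicon[noun])

print(adj_noun("red", "ball"))  # {'ball1'} -- composed by rule, not listed
print(adj_noun("red", "car"))   # {'car1'}

The contrast mirrors the point just made: stored word meanings on one side, rule-composed phrase
meanings on the other.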
Thus we find meaning in two places in a model of grammar: a relatively stable body of word
meanings, and the composed meanings of phrases and sentences. How do we connect the two? One can
conclude that semantic rules are also compositional and must be somehow related to grammatical rules. The
relationship differs from theory to theory: the Chomskyan theory of generative grammar claims that
syntax and semantics operate separately and are brought together at a level of Logical Form, whereas
in other theories semantic and grammatical rules are bound together, so that combinations of words
must be both semantically and syntactically permissible.

1.6 Some Important Assumptions


1.6.1 Reference and sense
According to Saussure, the meaning of linguistic expressions derives from two sources: the
language they are part of and the world they describe. The relationship by which language relates
to the world is called reference. But this is not all that matters, since words also have a value
that depends on their position within the vocabulary system (compare English ‘fish’ with Spanish ‘pez’),
and this is an aspect of their sense. The meaning of a word is partly defined by the other terms in its
semantic field: by what it is and by what it is not. Something similar happens with grammatical structures
across languages: compare a plural meaning “two or more” with the plural of languages that also have a
dual, where the plural means “three or more”.

1.6.2 Utterances, sentences, and propositions


Utterance: created by speaking or writing a piece of language. If the same thing is said by
two different people, it counts as two utterances.
Sentences: abstract grammatical elements obtained from utterances. Two people who say the same
thing using the same words in the same order have produced the same single sentence. Sentences are
abstracted from actual language use. Some kinds of information are relevant at the level of
utterances but not of sentences (pitch, intonation, accent…), and we therefore often
discard them when, for example, we directly quote someone.
Propositions: certain elements of grammatical information are irrelevant from a logical
perspective, for instance the difference between active and passive. Such grammatical differences are not
relevant in a chain of reasoning and can be ignored. Information structure is therefore
relevant at the level of sentences, but as long as the logical content is the same, the sentences
correspond to a single proposition.
Propositions can be represented in several ways: for example, in capital letters, or as
formulae in which the verb (usually in its base form) is treated as a function and the subject and
objects as its arguments, as in the example below.
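
As an illustrative sketch (the example sentence is my own; the notation follows the function-argument
format just described), an active sentence and its passive counterpart express a single proposition:

Active: “Caesar invaded Gaul.”
Passive: “Gaul was invaded by Caesar.”
Capital-letter representation: CAESAR INVADED GAUL
Function-argument formula: invade(caesar, gaul), where invade is the function and caesar and gaul are its arguments.

The two sentences differ grammatically but correspond to the same proposition.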

1.6.3 Literal and non-literal meaning


Although the distinction between literal and non-literal uses of language seems simple, defining it
proves very challenging, and it is difficult to draw a line between the two kinds of meaning. For one
thing, speakers shift the meanings of words to fit new situations, for example by metaphorical extension.
Some time after being coined, such expressions become fossilized and their metaphorical quality becomes
less transparent to speakers (they are then considered dead metaphors). It is therefore not easy to decide
when a word is being used literally or figuratively. This has led some linguists to claim that there is no
principled distinction between literal and metaphorical uses of language; they see metaphor as a basic way
of organizing human thought about the world. Lakoff and Johnson identify clusters of metaphorical uses
with labels such as “Time is money”.
Other semanticists maintain a valid distinction between literal and non-literal language, speaking, for
example, of faded or dead metaphors. In what we can call the literal language theory, metaphor and
other rhetorical uses of language require a different kind of processing from literal language:
hearers recognize an expression as semantically odd and are then motivated to give it
some interpretation, on the assumption that the speaker means something by it. In other words, the hearer
first discards the literal interpretation and then works out what is actually meant. We will
see that some writers in cognitive semantics (such as Lakoff) reject the literal language theory, since
they do not regard metaphor as something essentially different from ordinary literal language.

1.6.4 Semantics and pragmatics


These terms denote related and complementary fields of study, both concerned with the transmission of
meaning through language. However, the line between them is not easily drawn. For now, we will
rely on an early use of the term pragmatics in Charles Morris’s division of semiotics:

Syntax: formal relations of signs to each other
Semantics: relations of signs to the objects to which the signs are applicable
Pragmatics: the relation of signs to interpreters
On this view, we might say that we are doing semantics only when meaning is abstracted away from
users. One way of talking about this distinction is to differentiate between sentence meaning and
speaker meaning: we assume that words and sentences have a meaning independent of any particular
use, to which speakers then add further meaning in the context of utterance.
By making this distinction we could be freeing the semanticist from having to include all kinds of
knowledge in semantics, since investigating the relationship between encyclopedic and linguistic
knowledge would fall under pragmatics. But when we get into the details, it is still
complicated to determine which phenomena are semantic and which are pragmatic, and this is
something on which there is wide disagreement among linguists. What is clear is that it is very
difficult to strip context from language, since sentences are designed by speakers to be uttered
within a context and with desired effects.
