
26 Feb 2009

Mathematical Methods in Linguistics, Partee, ter Meulen, and Wall, Ch 13, Basic Concepts



by
Corry Shores

[Central Entry Directory]
[Logic & Semantics, Entry Directory]


Partee, ter Meulen, and Wall

Mathematical Methods in Linguistics

Chapter XIII: Basic Concepts



In first-order logic there are syntactic rules and semantic interpretations, and there is a relation between these two facets of the logic. But a formal language like first-order logic is different from a natural language like English, so it might at first seem that the relation between syntax and semantics is different for English than for logic. Yet Richard Montague raised controversy when he became the first to seriously propose and defend the thesis that this syntax/semantics relation is in fact not very different for natural and formal languages.

Montague built from Frege's principle of compositionality, which we will now examine.

13.1 Compositionality

There are many fields that use the term "semantics," each in its own way. In all cases, semantics is concerned with "meaning," but in no case is "meaning" very clearly defined.

In logic, however, the term "semantics" has a more precise meaning. There is a Western tradition that uses it in a specific way; this tradition is reflected in Kalish's article in the Encyclopedia of Philosophy, and Montague's work emerges from it.

In formalized logical systems, semantic interpretations are kept unambiguous and tightly related to syntax. We use Frege's principle of compositionality to articulate this correspondence between a formula's syntactic structure and its semantic interpretation.

Consider the sentence, "the tree is green." This expression has parts: "the tree," "is," and "green." Each part has a semantic meaning: we know what a tree is, and what green is. And we know that the "is" signals that we are predicating the quality of being green of the tree; the way the parts combine in this manner is governed by a syntactic rule. The semantic meanings of the parts, in conjunction with the syntactic rules for combining them, allow us to build up the meaning of the expression compositionally.

The Principle of Compositionality: The meaning of a complex expression is a function of the meanings of its parts and of the syntactic rules by which they are combined. (318bc)
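To make the principle concrete, here is a minimal sketch in Python (my own illustration, not the book's): the meaning of "the tree is green" is computed from the meanings of its parts together with a rule for combining them. The toy model, which individuals exist and which of them count as green, is invented purely for the example.

the_tree = "tree1"                        # meaning of "the tree": an individual
green = {"tree1", "fern7"}                # meaning of "green": the set of green things

def predicate(subject, property_set):
    # combination rule for "X is P": true just in case the subject is in the property set
    return subject in property_set

print(predicate(the_tree, green))         # True: "the tree is green"

The point is only that the meaning of the whole is determined by the meanings of the parts plus the rule that combines them, which is what the principle states.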

The principle of compositionality seems simple enough, but Montague's use of it involves sophisticated complexities.

We might try to specify the parts and workings of a formal language like first-order logic in a way that satisfies the compositionality principle. To do so:
1) we give the syntax through a recursive specification;
2) this recursive specification begins by stipulating what qualifies as the basic expressions of the given categories of the language;
3) it then provides recursive rules for forming acceptable expressions. These rules stipulate a) which forms are basic and b) which syntactic operations build up more complex forms from them.

We will use Greek letters to stand for expressions that are well-formed in the formal language. What the following definition will state is that we have basic expressions, and they may be combined by such logical operators as conjunction, disjunction, and implication. This forms new expressions. Then these new expressions may be recombined by means of the same logical operators, producing even more complex expressions that may in turn be combined again, and so on to infinity.
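As a rough sketch of such a recursive specification (again my own Python illustration with invented names, not the book's formalism), we can stipulate a stock of basic expressions and define syntactic operations that build more complex well-formed expressions out of expressions already built, whose outputs can then serve as inputs again, without limit.

BASIC = {"p", "q", "r"}                   # stipulated basic expressions

def conjoin(alpha, beta):
    # syntactic operation: build the conjunction of two expressions
    return "(" + alpha + " ^ " + beta + ")"

def disjoin(alpha, beta):
    # syntactic operation: build the disjunction of two expressions
    return "(" + alpha + " v " + beta + ")"

# The operations apply to basic expressions and, again and again, to their own outputs.
gamma = conjoin("p", "q")                 # (p ^ q)
delta = disjoin(gamma, "r")               # ((p ^ q) v r)
print(delta)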

We may consider the logical operation more abstractly as a function. Take two simple expressions, say "A is B" and "C is D". To simplify, give the symbol α to "A is B" and the symbol β to "C is D". So we have sentences α and β. Now we will conjoin the two. This is our logical operation: conjunction. We take both sentences as units, each with its own meaning and logical value, but we take them together, and together they have a new meaning and value. We will call their conjunction γ.

To produce γ, we treat α and β as "input" to an operation. So let's render them as raw input and write them like this: (α, β). We take these units as input and obtain a different output, γ. Our operation is conjunction in this example, but it could be any operation whatsoever. We call such an operation a "function." In math, we might consider the function f(x) = 3x: when we input 2, we multiply 2 by 3 to get 6. It is similar in logic. In our example, when you take α and conjoin it with β, you get something that says α and β. We will use ^ to mean conjunction, so we get α ^ β.

We will write "F" for any such function (it need not be conjunction); in the text's symbolization it carries a subscript, Fi. We place the function symbol before the input values: Fi(α, β). Since we took conjunction as our example, Fi(α, β) becomes α ^ β when the function is conjunction.

So, we perform a function whenever we use an operation to take a determinate input and give a determinate output (based on a pre-established convention or rule for determining that output).

Syntactic Rule n: If α is a well-formed expression of category A, and β is a well-formed expression of category B, then γ is a well-formed expression of category C, where γ = Fi(α, β).
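To illustrate the rule with a hedged sketch (the names f and F_i and the arithmetic comparison are mine, just for this example): a syntactic operation behaves like the arithmetic function f(x) = 3x, a fixed rule that takes determinate inputs to a determinate output.

def f(x):
    # the arithmetic example: f(2) = 3 * 2 = 6
    return 3 * x

def F_i(alpha, beta):
    # Syntactic Rule n with F_i taken to be conjunction:
    # from well-formed alpha and beta, form the well-formed gamma = alpha ^ beta
    return "(" + alpha + " ^ " + beta + ")"

print(f(2))                               # 6
print(F_i("A is B", "C is D"))            # (A is B ^ C is D)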


This allows us to build up the structures of our expressions. But it does not yet tell us how to interpret the symbols. For that we will need a semantic rule. Montague's method creates a parallel recursive rule that specifies how we will semantically interpret the syntactically related symbols.

The following rule first looks at the basic expressions, the sentences we called α and β. Earlier we let α stand for the sentence "A is B" and β for "C is D". But the syntactic expression α is one thing, and its semantic interpretation is another; we want to keep the syntactic symbols distinct from their interpretations. So we will write α' for the interpretation of α, and β' for the interpretation of β. Now we can deal with each expression and its interpretation by calling them α and α' (and likewise β and β').

Our semantic rule will say that
1) we interpret α as α';
2) we interpret β as β';
3) we know that Fi(α, β) equals γ;
4) we would expect the interpretation of γ to be γ';
5) we took Fi to be ^, but that was just an example. We interpret the symbol ^ to mean conjunction, "and." So "and" is the semantic interpretation of ^. "F" symbolizes the uninterpreted syntactic operation; let "G" symbolize the corresponding interpreted function, in our example "and." When we use "G," we are dealing with interpreted expressions such as α' and β'.

So our rule will say: syntactically, we get γ by performing the function F on α and β. But what do we get when we perform the interpreted function G on the interpreted inputs α' and β'? We should expect to get the interpreted form of γ. For γ results from the syntactic operation on α and β, so the interpreted form of γ should result from the interpreted operation on the interpreted inputs α' and β'.

Semantic Rule: If α is interpreted as α' and β is interpreted as β', then γ is interpreted as Gk(α', β').
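Here is a hedged sketch of how the two rules run in parallel (the Python names F_i, G_k, and interpret are illustrative, and interpretations are reduced to simple truth values, which is only one way of modeling them): γ is built syntactically by F_i, and its interpretation is obtained by applying G_k to the interpretations of its parts.

def F_i(alpha, beta):
    # syntactic operation: build the conjunction of two expressions
    return ("and", alpha, beta)

def G_k(alpha_prime, beta_prime):
    # semantic operation: the interpretation of conjunction, boolean "and"
    return alpha_prime and beta_prime

basic_interpretation = {"alpha": True, "beta": False}   # toy interpretations: alpha', beta'

def interpret(expression):
    # semantic rule: if gamma = F_i(alpha, beta), then gamma is interpreted as
    # G_k(alpha', beta'), where alpha' and beta' interpret the parts
    if isinstance(expression, str):
        return basic_interpretation[expression]
    op, alpha, beta = expression
    return G_k(interpret(alpha), interpret(beta))

gamma = F_i("alpha", "beta")              # the syntactic side: gamma = F_i(alpha, beta)
print(interpret(gamma))                   # the semantic side: G_k(alpha', beta') = False

The design point is the parallelism: for each syntactic operation F_i there is a corresponding semantic operation G_k, so the interpretation of a complex expression is computed from the interpretations of its parts.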



From:
Partee, ter Meulen, and Wall. Mathematical Methods in Linguistics. Springer, 1990.
