
31 Mar 2009

Turing Computation of the Natural Numbers


by Corry Shores


Turing Computation
of the Natural Numbers


We will now build from our previous post on Turing machines. What we want is a mechanical computation of the natural numbers.

[I obtain the following program from Peter Bradley's wonderful animated site on Turing Machines.]

Our abstract machine will compute the natural numbers and display them in binary. So it is a "binary counter" machine.

This is its program, displayed as a "machine table." The S represents the symbol that the machine is reading. The subscript '0' means the symbol is a zero, and the subscript '1' means it is a one.



The q's stand for the two different instructions. L means to move left. R means to move right. There are two boxes for each q instruction, because the machine will see one of two symbols (1 or 0), and depending on which one it sees, it will perform a distinct task. We could articulate the instructions this way.

q1) If the scanned symbol is zero, write a zero overtop of it, move to the left, and begin instruction q1. If the scanned symbol is one, write a one overtop of it, move to the right, and begin instruction q2.
q2) If the scanned symbol is zero, write a one instead, move to the left, and begin instruction q1. If the scanned symbol is one, write a zero instead, move to the right, and begin instruction q2.
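
To see the whole program at once, here is a minimal sketch in Python that simulates the binary counter. The names (PROGRAM, run_binary_counter) and the read-off of the count at each return to the bumper are illustrative choices of mine, not part of Bradley's presentation; the four table entries simply restate the two instructions above.

```python
from collections import defaultdict

# (state, scanned symbol) -> (symbol to write, head move, next state)
# q1 and q2 are the two instructions articulated above.
PROGRAM = {
    ("q1", "0"): ("0", -1, "q1"),  # write 0, move left, repeat q1
    ("q1", "1"): ("1", +1, "q2"),  # write 1, move right, begin q2
    ("q2", "0"): ("1", -1, "q1"),  # write 1, move left, begin q1 (the carry)
    ("q2", "1"): ("0", +1, "q2"),  # write 0, move right, repeat q2 (the clear)
}

def run_binary_counter(steps):
    tape = defaultdict(lambda: "0")   # an unbounded tape of zeros
    tape[-2] = "1"                    # the 'bumper', two squares left of the start
    state, pos = "q1", 0
    displayed = []
    for _ in range(steps):
        write, move, state = PROGRAM[(state, tape[pos])]
        tape[pos] = write
        pos += move
        # Each time the head lands back on the bumper, the squares to its right
        # display the current count, least significant digit first.
        if pos == -2:
            digits = [tape[i] for i in range(-1, max(tape) + 1)]
            displayed.append(int("".join(reversed(digits)), 2))
    return displayed

print(run_binary_counter(60))  # -> [0, 1, 2, 3, 4, ...]
```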


We will follow it through all the steps leading to its count of four.

It begins on a zero.



All the boxes are zero except for one. Two spaces to the left of the starting point, there is a '1'. We do not read this digit as part of the number. Its function is to 'bump' the machine back to the right so it may continue its adding. All the boxes to the right of the bumper are the binary digit places that display each new counted number as it is added. However, this display shows them inversely. Hence we need to think of the digit places ascending to the right. So normally 100 equals four in binary. But here it would be displayed as 001.
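
For instance, to read the displayed string back as an ordinary number, we just reverse it before interpreting it as binary. A one-line Python illustration, using the '001' example from this paragraph:

```python
displayed = "001"               # digit places ascend to the right of the bumper
print(int(displayed[::-1], 2))  # reversed to '100', which is four in binary
```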

So the machine finds itself at the second zero. We place a yellow arrow wherever the machine is found at the start of the given instruction. It always moves at the end of the instruction. We display its direction with black arrows.


We begin at the first instruction q1. Because the machine starts at a zero, we look to S0. It tells the machine to write a zero, and to move to the left. There is already a zero there, so nothing changes.


Likewise for the next step. Instruction q1 repeats.



But now the machine stands above a '1'.
It writes a '1' again, then it moves to the right and begins instruction q2.



At this box there is a zero. So the machine replaces it with a '1', moves to the left, and begins instruction q1. At this point, we have our first counted value: 1.


The number it scans next is a '1', so it writes a '1' over it, moves to the right, and begins instruction q2.



Here the machine finds a '1'. So it replaces it with a zero, moves to the right, and begins instruction q2 again. But notice the similarity to when we clear the binary abacus.



At the next place it finds a zero. Instruction q2 then tells it to write a '1' instead, move to the left, and begin instruction q1. This is equivalent to the carry procedure on the binary abacus.

So we see our first instance of clear and carry performed by the Turing Computer.


It now sees a zero. Instruction q1 tells it to write another zero, move to the left, and begin instruction q1.



It arrives upon the bumper. Instruction q1 tells it to write another '1' over it, move to the right, and begin instruction q2.



Now it sees a zero in the units place of our number display. Instruction q2 tells it to write a '1', move to the left, then begin instruction q1. In this way, we have added another number to get three.



It hits the bumper again, so it goes to the right and begins instruction q2.



Now it sees a '1' in the units place. Instruction q2 tells it to write a '0' instead, move to the right, and begin instruction q2. Here again is a clear procedure.



It sees another '1' in the next digit place. Instruction q2 tells it to write a zero, move to the right, and begin instruction q2. This is a second clear in the clear/carry procedure.


Now it sees a '0'. Instruction q2 tells it to replace the '0' with a one, move to the left, and begin instruction q1. This is the carry procedure that gets us to four. From here, the machine keeps repeating the same clear/carry process as before.

Given an infinite amount of time, it will display the infinity of natural numbers.



See Peter Bradley's site on Turing Machines

Programming Abstract Machines: Turing Machines in Boolos & Jeffrey's Computability and Logic

by Corry Shores


Turing Machines

Introduced by Boolos and Jeffrey


Our abstract machine is a man in a box. His cube travels along a railroad track.



The rail ties mark discrete squares along the track. The car may move along the track in either direction. We have rail workers on either end, always ready to add more track and ties as the box nears the edges.

Almost all the rail-ties have nothing but white ballast stones placed between them. But showing in a select few squares is one symbol from a finite set. They are configured using black ballast stones arranged against the background of the usual white ones.

The symbols in the squares can be anything. And we will refer to them abstractly using variables, so as to allow for every possibility. Hence we consider the finite number of symbols that may be found or placed in the squares: S1, S2, ..., Sn. The empty squares we will call S0. In our more concrete example, S0 will be a '0' in the square, and S1 will be a '1'.

The box's bottom is missing. So the man inside can read the symbols in each square. And he can change them or add a symbol to a blank square. The "poor mug" inside may also push or pull the box one place to the right or left. [Below is a modification of the Boolos and Jeffrey diagram. Image credits below.]


So the man performs his three actions: reading, writing, and moving. He does so according to a list of instructions. Any one instruction on the list has its own number i. He will be in a certain state at each stage of the computation process. What determines the state is the particular instruction he is performing. The total number of instructions is m. So we call the internal states q1, q2, ..., qm.
"He is in state qi when he is carrying out instruction number i."

Each instruction is formulated as a conditional. So it tells him what to do, depending on whether the symbol he reads is S0 or S1 or .... or Sn.

There are n + 4 things the boxed computer man can do. He can
1) Halt the computation.
2) Move one square to the right.
3) Move one square to the left.
4) Write S0 in place of whatever is in the scanned square.
5) Write S1 in place of whatever is in the scanned square.
.
.
.
n + 4) Write Sn in place of whatever is in the scanned square.

The instruction he carries out is the state he is in. Depending on what that is, and on what symbol he is scanning, the man will perform one or another of these n + 4 overt actions. And unless he is halted, he will also perform covert acts "in the privacy of his box." Namely, he will determine what the next instruction (the next state) will be. "Thus, the present state and the presently scanned symbol determine what overt act is to be performed, and what the next state is to be." (22b)

There are a number of ways that we may specify the overall program of instructions the man is to follow. For example, we might use a machine table, a flow graph, or a set of quadruples. These three types of descriptions are illustrated below. This is how they appeared for a machine that writes three symbols S1 on a blank rail track, and then halts, scanning the leftmost of the three.



We will now look more closely at this example.

The man begins above an empty square (which we depict with a '0'). His instructions are: "If the square is empty, write symbol S1. And if there is a symbol S1 in the square, move one square to the left. Halt after writing three S1's."

More explicitly, when following instruction q1, the man is to
1) If the scanned symbol is S0, write an S1 in the initial square and repeat instruction q1. But
2) If the scanned symbol is S1, move left and follow instruction q2 next.
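
As a quick restatement (a sketch of mine, using Python tuples rather than Boolos and Jeffrey's own notation), instruction q1 amounts to two quadruples of the form (present state, scanned symbol, act, next state):

```python
# Instruction q1 written as two quadruples: (state, scanned symbol, act, next state)
q1_quadruples = [
    ("q1", "S0", "S1", "q1"),  # on a blank square, write an S1 and repeat q1
    ("q1", "S1", "L",  "q2"),  # on an S1, move one square left and begin q2
]
```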

So we will represent the instructions according to the three methods.

The computer box begins above a zero.



The initial part of his first instruction is a conditional. If it is '0', make it '1.' Then repeat instruction q1.

First the machine sees a zero. So it enters a '1'. Then it returns to the beginning of this instruction.


This first conditional of the initial instruction is displayed in these ways in the different formats. In the machine table, we begin with q1. If the machine sees symbol S0, it is to write an S1. The table displays the 'q1' in S1q1 to indicate that we go back to the beginning of the instruction.


In the flow chart, the circular arrow indicates the return to the same instruction.



And in the quadruples, the repeated q1 indicates the return to the instruction's beginning.



So the computer box returns to the beginning of q1. The beginning says that if it reads S0, it should write S1. But it does not read S0. The next part of the instruction says that if the symbol is S1, it should move to the left and begin instruction q2. Thus our machine moves left.



We use an L to indicate the move-left operation in each of our three ways of showing the program.







Instruction q2 is the same as q1, only one recurrence further in the process. Our computer sees a zero, so it enters a 1 and returns to the beginning of q2.



And the notations follow as before.







Now that it sees the 1 it has written, it moves left to begin q3.








It now sees a zero, so it writes a 1, which is the third S1. But the instructions end here: each of our displays of the program ceases to give the man more tasks to perform, so the machine halts, scanning the leftmost of the three S1's.
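
Putting the whole walkthrough together, here is a minimal Python sketch of the three-S1 writer. The quadruple set is one plausible reading consistent with the walkthrough, and the simulator names are mine, not Boolos and Jeffrey's; the point is that no quadruple covers state q3 scanning an S1, so the machine halts there, scanning the leftmost of the three.

```python
from collections import defaultdict

# Quadruples (current state, scanned symbol, act, next state).
# An act is either a symbol to write or a move, 'L' or 'R'.
QUADRUPLES = [
    ("q1", "S0", "S1", "q1"),  # write the first S1, repeat q1
    ("q1", "S1", "L",  "q2"),  # then move left and begin q2
    ("q2", "S0", "S1", "q2"),  # write the second S1, repeat q2
    ("q2", "S1", "L",  "q3"),  # then move left and begin q3
    ("q3", "S0", "S1", "q3"),  # write the third S1
    # No quadruple for ("q3", "S1"): the machine halts scanning the leftmost S1.
]

def run(quadruples, max_steps=100):
    table = {(state, symbol): (act, nxt) for state, symbol, act, nxt in quadruples}
    tape = defaultdict(lambda: "S0")   # a blank, unbounded track
    state, pos = "q1", 0
    for _ in range(max_steps):
        key = (state, tape[pos])
        if key not in table:           # no instruction applies, so the man halts
            break
        act, state = table[key]
        if act == "L":
            pos -= 1
        elif act == "R":
            pos += 1
        else:
            tape[pos] = act            # otherwise the act is a symbol to write
    return tape, pos

tape, head = run(QUADRUPLES)
print(sorted(tape.items()), "halted at square", head)
# -> three S1's on squares -2, -1, 0, with the head on the leftmost (-2)
```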











Boolos, George, and Richard Jeffrey. Computability and Logic. Cambridge: Cambridge University Press, 1989.


Modified train track image from:


The Abstract Machines of Computation Theory. Marvin Minsky's Computation, Introduction and Finite-State Machines


by Corry Shores
Marvin Minsky

Computation:
Finite and Infinite Machines

Chapter 1 Physical and Abstract Machines

1.2 Machines as Physical Models of Abstract Processes

Someone describes a machine to us. We are not initially interested in how it is constructed. We first want to know how the machine works in principle.

The idea of a machine usually centers around some abstract model or process. (5c)

Consider a theory in physics. Perhaps material circumstances do not support the theory. In that case we criticize the theory, not the material world. However, imagine we design and build a machine. Soon it breaks down. So we fix it. Unlike in physics, we do not criticize the theory. Instead we find fault with the system's material parts. (5-6)

[In other words, we are designing, building, testing and evaluating abstract machines for computational tasks.]


Part I Finite State Machines

Chapter II Finite-State Machines

2.0 Introduction

We may also call finite state machines 'finite automata.' The operations of these 'highly idealized' machines "proceed in clearly separate 'discrete' steps from one to another of a finite number of configurations or states." (11b)

Because they are finite, we may completely describe their structure and behavior without any ambiguity or approximation.
It is much harder to deal with more realistic models of mechanical systems, in which variable quantities like time, position, momentum, friction, etc., vary smoothly over continuous, imperceptibly changing ranges of values. (11c)
[These would be the natural machines that compute analogically. We will later discuss as examples Pythagoras' monochord and the golden ratio.]

Nonetheless, our digital systems are still quite powerful, and in many ways they can approximate anything that an analog physical system can accomplish. [We later address also the issue of approximation in terms of the analog/digital argument from "noise."]


2.0.1 Moments of Time

The finite machine's operations occur throughout a linear sequence of discrete time moments, "between which nothing happens." (12c)
One may imagine these moments as occurring regularly like the ticking of a clock, and we identify the moments with integers -- the variable for time, t, takes on only the values 0, 1, 2, . . . (12d)


2.1 States and Signals

From the point of view of its user or environment, the machine may be regarded as a "closed box with input and output channels." Every now and then, the user acts upon the machine through the input channels. And at other times the machine acts upon the user through the output channels. Unless we want to know how things work on the inside of the box, we do not need to know what takes place inside it. Hence we call it a "black box."



We are concerned with the set of distinguishable states that characterize a channel at any given moment. Each time-point is discrete. One of a finite number of possible states or conditions corresponds to each discrete moment in time.

We might symbolize the state using alphabet letters, numbers, or such words as "yes" and "no." In our examples, we consider our electric switch to be in either an "on" or an "off" position. So our finite state machine theory is not concerned with continuous or analog devices "which can be set to transmit any of an infinite set of fractions of a signal."

We want to know how our machine behaves. So we need to determine how its outputs depend on its inputs. Then we could know the output for some certain specified time.
One restriction we will introduce here is this -- the output state at the moment t does not depend on the input state at that same time t. This is to say, that we won't admit the possibility of signals traversing a unit instantaneously. But the output at time t + 1 will usually depend, at least in part, on the input signal at the previous moment t. This restriction reflects the physical impossibility of instantaneous transmission from one place to another. It is necessary here for a simpler reason: to prevent unrealizable -- paradoxical -- descriptions of situations when channels are interconnected, and to prevent infinitely fast behavior -- also physically unrealizable. (14 c.d, my emphasis on instantaneous)
[Soon we will examine natural computers and the way they instantaneously calculate infinite computations.]
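
A minimal sketch of this restriction, using a hypothetical one-bit 'delay' machine of my own rather than one of Minsky's examples: the signal that arrives at moment t shows up on the output channel only at moment t + 1.

```python
def unit_delay(inputs, initial_state="0"):
    """A finite-state machine whose output at moment t + 1 is the input it
    received at moment t; the output never depends on the input at that same moment."""
    state = initial_state        # the state simply stores the previous input
    for signal in inputs:
        yield state              # output at moment t, fixed before input t is read
        state = signal           # the input at t becomes the state for t + 1

print(list(unit_delay(["1", "0", "1", "1"])))  # -> ['0', '1', '0', '1']
```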


Minsky, Marvin. Computation: Finite and Infinite Machines. London: Prentice-Hall International, Inc., 1972.

Stoic Logic, Mates, Chapter 2, §6 Comparison with Modern Theories (§2)



by Corry Shores




Benson Mates

Stoic Logic

Chapter II: Signs, Sense, and Denotation

§6 Comparison with Modern Theories (§2)





We will compare Stoic semantics with Frege's and Carnap's.

Mates provides a table that classifies each one's semantic distinctions.



He explains that the Stoic theory bears two serious deficiencies compared with Frege's and Carnap's theories.
1) there is no principle of interchangeability or a discussion of problems related to it, and
2) there are no Stoic discussions of 'oblique' or 'not purely designative' expressions.

The Stoics make three fundamental semantic distinctions.
a) το σημαίνον (to semainon)
b) το λεκτόν (to lekton), and
c) το τυγχάνον (to tugchanon)

These correspond in many ways to Frege's
a) Zeichen
b) Sinn, and
c) Bedeutung

And to Carnap's
a) designator
b) intension
c) extension.

The Lekton is what the sign designates or means. It is also what we grasp as existing in close connection with our intellect. As we noted before, the Lekton is what the Barbarians do not understand when they hear Greek words spoken. (22a.b)

According to Carnap, Sinn and intension "refer to meaning in a strict sense, as that which is grasped when we understand an expression without knowing the facts." (Carnap qt in Mates 22b)

For Frege, the sense (Sinn) of a sign is "the manner in which that which is denoted by the sign is given." (qt 22b) For example, we let A, B, C be the medians of a triangle. [Image credits provided below]



We see that lines A, B, and C meet at a point in the triangle's center. So we could refer to the "intersection of A and B." And that would denote the same point as if we referred to "the intersection of B and C." Nonetheless, the two expressions do not have the same sense.

Or consider how, for long stretches of time, Venus appears in the night sky before dawn, while during other stretches it appears after dusk.


For a long time, human cultures thought these were two different 'stars.' So the 'evening star' was called Hesperus or Vesper. And the 'morning star' was called Phosphorus or Lucifer. Yet both Hesperus and Phosphorus denoted the same planet.

It is interesting to see why this phenomenon occurs.

Venus' orbit is closer to the sun than the earth's.



And let's say that, from our cosmic perspective, the earth is rotating in this direction.



And we mark our position on the earth with an 'x.' As the earth rotates through the night, the sun comes closer to the horizon. Just before dawn, Venus rises into the sky.



But every 584 days, Venus will cross in front of us and pass over to the sun's other side. Then it will no longer show before dawn. However, right after dusk, it will be one of the first and brightest heavenly bodies in the early night sky.



Whole mythologies were developed around the distinct characters of Vesper and Lucifer. As Vesper bells ring, some make Vesper prayers, calling for God's assistance, as the evening star rises to be seen. And Lucifer was once the prince of light. To punish his heavenly rebellion, God casts him to hell. At dawn we see Lucifer's light overcome by the sun.

Clearly the terms "morning star" and "evening star" have full rich meanings that can be distinguished from each other. But they both refer to just one planet. The extensional meaning (Bedeutung) is Venus. Both expressions "morning star" and "evening star" extend out to Venus, the object in extended space. Yet, each one has a meaning internal to it. Their sense (Sinn) does not extend outward to extended objects. Rather it is an inner tendency to call upon unextended images, notions, emotions, and so forth. There are warring inner tensions of meaning. This is an intension, that is, an intense sense.

Frege also distinguishes the idea (Vorstellung) from the sense (Sinn). The idea is subjective and private. However, the sense is objective and public. Likewise, the Stoics distinguish the Lekton from the presentation (φαντασία phantasia). The Lekton is the content of a rational presentation. In this presentation, discourse may present the φαντασθέν (phantasthen). Hence, we may call the Lekton the presentation's "objective content" (το φαντασθέν). However, Frege considers sense to be the "objective content" of the Vorstellung. For Frege, sense is 'between' the subjective idea and the denoted object, much like Ammonius' claim that the Lekton is between the thought and the thing. (22c)

To make our comparisons more precise, we need to know if the Lekton of every expression is the same as its sense. So we turn to the further distinctions in rows b to e of our table.

'Signs' B: Individual Name
Consider the name "Aristotle." The Lekton corresponding to such an individual name is its characteristic properties. So for the name 'Aristotle,' the Lekton is the property of being Plato's student and Alexander the Great's teacher. Frege's understanding of sense is very similar, but he does not specify that the properties need to be peculiar to the individual. For him, the sense of 'Aristotle' is 'the student of Plato and the teacher of Alexander.' (23a.b)

'Sense' B: Individual Concept
For Carnap, an individual expression's intension is its 'individual concept.' This corresponds to Frege's Sinn. So like Frege, he does not agree with the Stoics that the intension must be peculiar to the individual. However, the Stoics, Carnap, and Frege all regard the intension as being properties.

Denotation B: Extension
Carnap uses the term 'individual' for the extension of an individual expression. However, neither Frege nor the Stoics specify such a category. Note a superiority of the modern view over the ancient.
The Stoics asserted flatly, in accordance with their materialism, that the objects denoted by all expressions are bodies, just as the signs are bodies. (23c, emphasis mine)
However, Frege and Carnap do not articulate such a metaphysics. Yet, Frege resists the skeptics and idealists who claim that no expression has a denotation. For, we
1) "usually intend to talk about something more than our own ideas"
2) assume that our expressions have denotations, and
3) presume that "our intention is justification enough for introducing the concept of denotation." (24a)

However, this is not a metaphysical argument.

'Signs' C: Class Names
For both Carnap and the Stoics, the intensions of class names are properties of their members. Yet, the only Stoic examples we have for class names are "man," "horse," "goddess," and "wrath." These seem more to be species, because "manhood" for example might more properly be considered a genus. And Frege does not refer to class names specifically.

D: Predicates
The Stoics' ρήμα (rema) corresponds well to Carnap's "predicate." In Stoic semantics, a predicate combines with a subject to form a proposition. Frege could have used the term Prädikat. Hence, the intension of a predicate is the property corresponding to the class name.

E: Sentences
The Stoics, Frege, and Carnap all agree that the intension of a sentence is a proposition. But unlike Frege and Carnap, the Stoics never say that a sentence's extension is its truth value.

Conclusion
We see then that there is remarkable similarity between the Stoic and modern semantical views in regard to the intension of various linguistic expressions. All three theories
1) agree on the sorts of entities that serve as intensions for the different kinds of expressions, and
2) hold that "the intension of a part of a sentence is a part of the intension of the sentence." (26b)




From:

Mates, Benson. Stoic Logic. Berkeley: University of California Press, 1973. [Originally published in 1953 as Volume 26 of the University of California Publications in Philosophy.]

Image of Venus in night sky:

Triangle image: