Writing by Peter Hilton

What numbers are not

A philosophy of mathematics essay about the problems of trying to define numbers rigorously.

Abstract

A discussion of the essay 'What Numbers Could Not Be' by Paul Benacerraf (The Philosophical Review, 1965), reprinted in Philosophy of Mathematics: Selected Readings, edited by Benacerraf & Putnam.

Benacerraf's essay, as its title hints, never claims to explain exactly what the numbers are. Rather, it concentrates on exposing the limitations of various accounts of the numbers, i.e. explaining what the numbers are not. In particular, Benacerraf concludes that numbers are not sets, contrary to what most mathematicians assume. In the conclusion, however, when he presents his own viewpoint on the issue, we see that he takes a far stronger view: he claims that the numbers do not exist at all. The view that numbers do not exist is largely a sound one: not only does it avoid the difficulties of the theories presented earlier on, but Benacerraf is able to offer some justification for it. Moreover, this view is mathematically acceptable, since mathematics does not demand that numbers be particular objects, merely that some suitable structure exists. It would be possible to conclude that, since numbers are a purely mathematical concern, the view that numbers do not exist as individuals must be philosophically acceptable as well.

Introduction

Philosopher: Can you tell me what the number three really is?
Mathematician: It's the number that comes after two, of course!
Philosopher: So what's two then?
Mathematician: Well, two is what comes after one and zero.
Philosopher: Aha, so what about zero?
Mathematician: Oh, that's easy: zero is just the smallest number!

This essay is concerned with considering some of the points raised in Benacerraf's essay in which he explores the natural question that follows from the supposition that numbers exist, viz. what things are they? The motivation is that if we are to talk about numbers and use them in mathematics then we would like to have a clear understanding of what they are. If all of mathematics is to be reducible to the theory of sets, then we wish to know where the numbers are to fit in. What we seek is a definition of exactly what each number is so that we can be as sure about what is meant by 'the number three' as we are about 'the empty set'. Note that I always mean the natural numbers when I refer to 'numbers'. I am not concerned with what it means for numbers, or any other mathematical abstraction for that matter, to exist.

Benacerraf's essay is structured into three sections: the first is an exposition of some current accounts of number; the second discusses the particular difficulties associated with such accounts, concluding that they are fundamentally wrong; and the third is an attempt to justify the conclusion of the previous section. After a brief introduction to the issue I will present some of Benacerraf's own views, as he expresses them in his concluding section, and explore what he says.

First of all, Benacerraf presents the idea that the natural numbers can be defined as an infinite set with an ordering relation, such as <, and that this mathematical definition accounts exactly for the common notion of what we mean by 'numbers'. The implicit claim made by this kind of account is that if our intuitive understanding of numbers is accounted for by such a theory, then the latter must be of some importance. Benacerraf is claiming no such thing himself here: although it does not become apparent until later on, these are not his views; he is merely presenting them as a subject for later discussion.

In the course of detailing exactly what is needed for an understanding of numbers Benacerraf reaches an interesting question by considering the applications of the natural numbers, the foremost of which is counting. His question is to ask what counting is and how we learn to do it. Benacerraf presents counting as follows:

There are two kinds of counting, corresponding to transitive and intransitive uses of the verb 'to count'. In one, 'counting' admits of a direct object, as in 'counting the marbles'; in the other it does not.

He explains that learning intransitive counting simply involves learning the appropriate words and how to repeat them in the right order; it is necessary to learn how to be able to generate the words in the right order so that one never runs out. After learning how to count one is no closer to having identified the numbers, having merely given them names in whatever language one happens to be using. This observation clarifies the nature of the problem which concerns us: number names are thoroughly familiar to us, but we have no precise account of what it is that is being named. We must ask what the numbers we refer to are, being careful not to confuse a number with its name.
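As a loose illustration of the distinction (my own, not Benacerraf's, with invented function names), intransitive counting can be thought of as a rule for producing the number words in order, without end, while transitive counting pairs the things being counted with the words so produced. A minimal Python sketch, using ordinary decimal numerals as the number words:

def number_words():
    # Intransitive counting: generate the number words in order, indefinitely,
    # so that one never runs out. Only the names appear here; nothing is said
    # about what, if anything, they name.
    n = 0
    while True:
        n += 1
        yield str(n)   # '1', '2', '3', ...

def count_the_marbles(marbles):
    # Transitive counting: pair each marble with the next number word;
    # the word paired with the last marble answers the question 'how many?'.
    answer = '0'
    for marble, word in zip(marbles, number_words()):
        answer = word
    return answer

print(count_the_marbles(['red', 'blue', 'green']))   # prints: 3

The sketch leans on Python's own integers to do the generating, but the only things handed back to the counter are the numerals, i.e. the names.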

It would be natural to assume that when we count, for example, we are actually using the numbers or, at the very least, that we are thinking about them. Now, intransitive counting names the numbers, and transitive counting names a particular number associated with some set. The numbers themselves are not involved directly: rather, they are being mentioned but not used. We could maintain the symbol-object distinction between the numbers and their names by taking the view that counting, in common with the other applications of numbers, uses only number names without the numbers themselves. The justification for claiming that we never use the numbers directly is that the number names themselves will serve perfectly well as the set of numbers. The idea of using number names as a system of numbers will be discussed again later on.

The mathematician is quite happy to define each number to be some particular set and then to use numerals to refer to these sets (see footnote 1). In the second part of his essay, Benacerraf points out the principal difficulty with this:

We have two (infinitely many really) accounts of the meaning of certain words ('number', 'one', 'seventeen' and so forth) each of which satisfies what appear to be necessary and sufficient conditions for a correct account.

What he means is that there is no unique choice of sets which is a more correct account than any other (see footnote 2). As Benacerraf states, it is clearly absurd that more than one account is correct and that, for example, both 2 = {Ø, {Ø}} and 2 = {{Ø}}. He then sets out to explore the idea that one account does not contain precisely those conditions which are necessary and sufficient, but finds that this does not lead to any useful ideas. He concludes that there can be no single such account of what the numbers really are. It would seem, then, that the best that such accounts can do is model whatever the true state of affairs may be, in the same way that the theories of physics model the physical universe without claiming to be the last word on the subject. Clearly, being unable to give a definitive account of what the numbers are is not ideal: if the question of what numbers are is part of mathematics then the usual standards of mathematical rigour and certainty ought to apply. Benacerraf puts it this way: 'one who identifies 3 with some particular set does so for the purpose of some theory and does not claim that he has discovered what object 3 really is'. A similar situation arises when one assumes that space is Euclidean for the purposes of geometry; to question this assumption is to miss the point: it does not matter, provided that one accepts that whatever one proves holds only for Euclidean space. Regardless of whether the universe is in fact Euclidean, the results of Euclidean geometry are useful, for they apply to the world we know, and they have a mathematically solid foundation.
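To see how the two accounts come apart, it helps to write out the first few numbers under each; this is just the content of footnotes 1 and 2, restated side by side:

Footnote 1 (successor of x is the union of x and {x}): 0 = Ø, 1 = {Ø}, 2 = {Ø, {Ø}}, 3 = {Ø, {Ø}, {Ø, {Ø}}}, …
Footnote 2 (successor of x is {x}): 0 = Ø, 1 = {Ø}, 2 = {{Ø}}, 3 = {{{Ø}}}, …

The two accounts agree about 0 and 1 and then diverge: on the first, 1 ∈ 3 is true; on the second, it is false. They cannot both be reporting what the numbers really are.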

The idea that it is not of vital importance to know what the numbers really are is important, because it supports the idea that to discover what the numbers really are would not necessarily improve mathematics. It is enough to suppose that the numbers are an arbitrary example of a suitable structure, such as a simply infinite system (that is, a set with a distinguished initial element and a one-to-one successor map, in which the initial element is no element's successor and every element is reached from it by repeated succession), and to prove results which then apply only to such structures (simply infinite systems, say). To assume that the numbers are some example of a particular structure does not limit mathematics, because we observe that structures such as simply infinite systems are adequate accounts of the numbers. This is not to dismiss the question but rather to lend some credibility to the idea that to identify the numbers individually may not be possible. Here Benacerraf differs from Frege, who supposes that all statements of the form x = y are meaningful. Instead, Benacerraf prefers to assert that any statement identifying a number with something which is not a number, e.g. 2 = {{Ø}}, is meaningless. However, to claim that any identification of a number with a non-number is meaningless is to assert that numbers do not admit of definition. If this is the case, and numbers are indeed indefinable, then it is difficult to offer any justification for supposing their existence at all: that would be like trying to justify the existence of such a thing as happiness, for example. Mathematics has no place for such vague ideas. Benacerraf does not follow the theory that statements identifying numbers are meaningless very far, for after he presents his views on statements about identity he wavers between this and another conclusion: he considers the idea that, rather than being meaningless, such identities could simply be false, finishing the section with:

What is enticing about the view that these are all false is, of course, that they hardly seem to be open questions to which we may find the answer any day. Clearly all the evidence is in; if no decision is possible on the basis of it, none will ever be possible. But for the purposes at hand the difference between these two views is not a very serious one. I should certainly be happy with the conclusion that all [such identities] are either senseless or false.

There are two interesting points here. Firstly, to say that all statements which identify the numbers are false seems to lead to the same conclusion as before, viz. that the numbers are indefinable. There is a subtle difference, though. To say that numbers are indefinable is to say that no definition, however plausible, can be shown to be better than all other definitions. To claim that all definitions of numbers are false, however, is a stronger assertion: that such definitions have a well-defined truth value. Benacerraf argues that to claim that number definitions are false seems to be wrong, because there is no justification for thinking so. I would approach the matter from a different angle: a number definition need not be false. For example, suppose that addition and division have been defined, and define 3 = 1+1+1 and 1 = 3÷3. Although this definition is useless, because it is circular, it would be unreasonable to claim that this definition is false.

Secondly, there is Benacerraf's peculiar statement that 'all the evidence is in'. There is no evidence to appeal to when considering the identity of numbers, so either Benacerraf is confused or he really means that all possible accounts have already been considered. Whether all possible accounts have been considered is far from clear; in fact, since no account admits of formal proof, one would be mistaken to claim that no better account is possible.

It is likely that Benacerraf's uncharacteristic ambivalence and brevity are due to the fact, alluded to in the penultimate sentence, that he considers the whole enterprise of identifying or defining the numbers to be misguided and irrelevant.

It is at the end of the second section that Benacerraf makes the most important claim of his essay. Before he makes this claim he argues against the idea that numbers are necessarily class predicates, explaining why the arguments for this point of view are not compelling. In particular he does not refute the idea that, for example, the number three is the class of all triples; he merely states that this may not be true. Benacerraf does not discuss numbers as proper classes of sets (see footnote 3) any further, but goes on to sum up the essential nature of the problem at hand as follows:

Our present problem is to see if there is one account which can be established to the exclusion of all others, thereby settling the issue of which sets the numbers really are. And it should be clear by now that there is not.

He is referring to the fact that there are infinitely many set-theoretical accounts of what the numbers are, yet we are unable to pick one account to the exclusion of all the others. It is not yet clear in what sense we are unable to do this. Pursuing the same line of thought, Benacerraf says: 'If numbers are sets, then they must be particular sets, for each set is some particular set', which is the point made earlier about it being absurd that the number 2 might be identical to two distinct sets. The next sentence is:

But if the number 3 is really one set rather than another, it must be possible to give some cogent reason for thinking so; for the position that this is an unknowable truth is hardly tenable.

I do not think that this is necessarily so, for the identity of the number three could be undecidable in the same way as, say, the Continuum Hypothesis. Although not an appealing position, it is certainly a tenable one. Benacerraf covers this point earlier on in his essay, realising that he could be construed as claiming that all mathematical questions are decidable. There his response is that the question is neither a mathematical one nor one amenable to proof. Certainly the identity of the numbers may be no more provable than the axioms of set theory, but I consider the question to be a mathematical one, even though mathematicians currently consider it to be outside their field of study.

Now, clearly there ought to be some reason for thinking that 3 might be some particular set, for if there were not, it would be equally likely to be some other set. The distinction that needs to be made here is between having reason to suppose that a particular number is a particular set, and being able to prove it. In fact, we cannot prove that a particular number is a particular set, or as Benacerraf puts it: '… any feature of an account which identifies 3 with a set is a superfluous one…'. This leads Benacerraf to his ultimate conclusion, viz. that numbers are not sets at all.

The third and final section of Benacerraf's essay, then, offers various justifications for the claim that numbers are not sets, together with Benacerraf's own position on the matter. The basis of the claim that numbers are not sets is essentially that, since any recursive progression can be used for the natural numbers, it is only the structure that is important and not the individual objects themselves. This view sidesteps the question of what objects the numbers are by asserting that they are not objects at all. Rather, the numbers as a whole are an abstract structure whose individual elements have no properties other than their relation to the other elements. Benacerraf's view, then, is that individual numbers do not exist as numbers at all. He says:

'Objects' do not do the job of numbers singly; the whole system performs the job or nothing does. […] The pointlessness of trying to determine which objects the numbers are thus derives from the pointlessness of asking the question of any individual number.

The first sentence accounts for the claim that it is meaningless to ask whether 3 = {{{Ø}}} on the grounds that it is not being asked in the context of the rest of the numbers. Thus 'What are the numbers?' is a more sensible question than 'What is the number three?' provided, as the second sentence above claims, that the question is really 'What structure is "the numbers"?' and not 'What objects are the individual numbers?'. Benacerraf explains that to characterise the numbers is only to describe the structure, without any identification of the individual elements, and that this is why numbers are not objects at all. If numbers are not objects then they are certainly not sets. The idea behind thinking that numbers are not objects is that, given some system which has precisely the properties of the numbers, such as the sequence of sets described in footnote 1, it must be possible to distinguish between the different elements of the system without referring to the role they play in the structure. In the case of the sets of footnote 1, for example, each set is distinguished by how many elements it has. However, the numbers themselves cannot be distinguished in this way, for they have no identity other than the role they play in the overall structure, and hence they do not exist as objects.
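The point that 'the whole system performs the job' can be made concrete with a short sketch (again mine, not Benacerraf's; the names are invented). Counting, as written below, uses only a starting element and a successor operation, so the progressions of footnotes 1 and 2 serve equally well, even though they deliver different sets:

def footnote_2_successor(x):
    # Footnote 2's rule: the successor of x is {x}.
    return frozenset([x])

def footnote_1_successor(x):
    # Footnote 1's rule: the successor of x is the union of x and {x}.
    return x | frozenset([x])

def count_with(zero, successor, things):
    # Counting uses only the structure: a starting point and a way of moving
    # on to the next element; it never asks what the elements 'really are'.
    position = zero
    for _ in things:
        position = successor(position)
    return position

marbles = ['red', 'blue', 'green']
empty = frozenset()   # Ø plays the role of zero in both progressions
print(count_with(empty, footnote_2_successor, marbles))   # a frozenset representing {{{Ø}}}
print(count_with(empty, footnote_1_successor, marbles))   # a frozenset representing {Ø, {Ø}, {Ø, {Ø}}}

Either output serves equally well as 'the number of marbles'; all that matters is the position it occupies in its own progression.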

Now, to deny the individual numbers the status of objects is to assert that we can only refer to number names. When Benacerraf makes this point he argues that 'there are not two kinds of things, numbers and number words, but just one, the words themselves'. This is a strange thing to say, for it is to claim that numbers do not exist whereas number names do. It is inconsistent to use the name of something which does not exist, since instead of naming some 'thing', the 'thing' to which one refers is just the name. So the number names cannot be the names of the numbers, but must be something else altogether.

They are, in fact, names which we give to the elements of whichever progression we happen to be using for the numbers. The fact that they form such a progression themselves puts them on the same level as any other progression. Thus it would be more correct to say not that the number names refer to the elements of a progression, but that the number names are the elements of a progression, whether it be '0, 1, 2, 3,…'; 'zero, one, two, three,…'; 'Ø, {Ø}, {{Ø}}, {{{Ø}}},…' or 'zéro, un, deux, trois,…'. Perhaps this is not a central issue, but it is useful to have clear and sound ideas about it. Thus '1', 'one', '{Ø}' and 'un' are all number names which play the same role in their respective progressions. However, when, in the quotation above, Benacerraf refers to 'the words themselves' he means the words 'zero, one, two, three,…', and then says that 'The central idea is that this recursive sequence is a sort of yardstick which we use to measure sets' as if that particular sequence were distinguished in some way. He justifies this statement immediately afterwards by saying:

Although any sequence of expressions with the proper structure would do the job for which we employ our present number words, there is still some reason for having one, relatively uniform, notation: ordinary communication. Too many sequences in common use would make it necessary for us to learn too many different equivalences.

What he says here is absurd because it completely fails to acknowledge that a multitude of different languages are spoken in the world. Certainly, if the whole world used the same sequence to talk about numbers then communication would be easier, but this is not the case: there is no one system of words or even numerals that is universally understood. For only a mathematician would consider 'zero' to be Ø, and only a francophone would consider it to be 'zéro'. If words are taken to be what is spoken, then the numeral '0' is confined to literates who use Arabic numerals. His second point is also wrong, because despite the proliferation of different sequences in common use around the world, most people only have occasion to use two or three different equivalences.


Footnotes

1 In the modern set-theoretical treatment of numbers, each number is defined to be some particular set, e.g. define zero to be the empty set, and the successor of each number x to be the union of x and {x}; that is, the number x is the set {0, 1, 2, 3,…, x-1} for x > 0. This is John von Neumann's construction.

2 For example, take zero to be the empty set as in the footnote above, but this time take the successor of each number x to be {x}; that is, the number x is the set {x-1} for x > 0. This is Ernst Zermelo's system.

3 The class of all sets of a given non-zero size is not a set; that is, it is a proper class. This is the biggest failing of the theory that 3, for example, is the class of all triples, because proper classes cannot be treated as ordinary mathematical objects (they cannot, for instance, be members of other classes).
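A quick way to see that the triples form a proper class, assuming the axiom of foundation: every set x belongs to at least one triple, for example {x, {x}, {{x}}}, so the union of the class of all triples contains every set whatsoever. If the class of all triples were a set, then by the axiom of union its union would be a set too, and there would be a set of all sets, which is impossible.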
