

Any comments are welcome - e-mail to Karlis.Podnieks@lu.lv

Towards a Real Finitism?

By K. Podnieks

This work is licensed under a Creative Commons License and is copyrighted © 2005-2011 by me, Karlis Podnieks.

[Inspired in part by reading Doron Zeilberger's Opinion 57 - http://www.math.rutgers.edu/~zeilberg/Opinion57.html.]

Among mathematicians, there is a minority who do not accept actual infinity while accepting the potential one. To me, this resembles "platonism with respect to natural numbers" as opposed to unrestricted set-theoretical platonism. Still, should "subtle differences" play an important role in philosophy? Shouldn't a useful philosophy be "robust"? So, why should we stop at rejecting only one kind of infinity?

In this way, some investigators have already arrived at various forms of so-called ultra-finitism. But isn't their starting point mainly negative – rejecting infinity? A positive product could be the notion of feasible natural numbers. But is this structure rich enough to become really useful? So, wouldn't non-infinitism be a better term for this approach?

Shouldn't a real finitism be more radical and start positively? Couldn't we obtain a richer structure by replacing the traditional "linear" notion of natural numbers with the notion of functional expressions?

Of course, this idea is not new - its history can be traced back to 1934, to Bernays' number 67^257^729 (where ^ stands for exponentiation): "What does it mean to claim the existence of an Arabic numeral for the foregoing number, since in practice we are not in a position to obtain it?"
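To make Bernays' point quantitative (my own back-of-envelope sketch in Python, not part of his argument): the exponent 257^729 is itself perfectly feasible - its decimal numeral has 1757 digits and is computed instantly - but the numeral of 67^257^729 would have about 257^729 * log10(67) digits, a digit count that is itself a number of some 1757 digits.

import math

# The exponent 257^729 is feasible: its exact numeral appears at once.
exponent = 257 ** 729
print(len(str(exponent)))      # 1757 digits

# But the numeral of 67^(257^729) would have
#     log10(67^(257^729)) = 257^729 * log10(67)
# digits -- that digit count alone is a number of about 1757 digits,
# so the numeral can never be written out.
print(math.log10(67))          # about 1.826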

But what if we went further and claimed that expressions are more fundamental than natural numbers? And that natural numbers are simply expressions based on 1 and +, i.e. 1, 1+1, 1+1+1, etc.? Binary numerals can then be regarded as expressions based on 1 and two functions, 2x and 2x+1. And, in general, a "functional base" could consist of any small list of natural numbers and any small list of computable arithmetical functions. Of course, to carry this argument through consistently, we should completely remove the traditional natural numbers "from behind"...
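As a toy illustration of "numbers as expressions over a functional base" - a minimal sketch in Python; the representation is my own choice, not part of the proposal:

from dataclasses import dataclass

@dataclass(frozen=True)
class One:
    pass

@dataclass(frozen=True)
class App:
    op: str      # the name of a base function, e.g. '+', '2x', '2x+1'
    args: tuple  # its argument sub-expressions

# Evaluation maps an expression back onto the traditional natural
# numbers -- exactly the step a consistent finitism would remove.
def value(e):
    ops = {'+': lambda a, b: a + b,
           '2x': lambda a: 2 * a,
           '2x+1': lambda a: 2 * a + 1}
    if isinstance(e, One):
        return 1
    return ops[e.op](*(value(a) for a in e.args))

# 5 as a unary expression, (((1+1)+1)+1)+1 ...
five_unary = App('+', (App('+', (App('+', (App('+', (One(), One())),
                                           One())), One())), One()))
# ... and 5 as a binary numeral: 2*(2*1) + 1.
five_binary = App('2x+1', (App('2x', (One(),)),))
assert value(five_unary) == value(five_binary) == 5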

But, as a first approximation, let us still use the traditional natural numbers - simply to allow comparison of expressions by their natural number "values".

Imagine we have two different functional bases (for example, {1, +, *, ^} and {1, +, *}): how complicated is the task of converting expressions of Base 1 into equivalent expressions of Base 2? And how complicated is the task of comparing the "values" of two expressions belonging to the same functional base? As Bernays' example shows, the conversion task may be unfeasible. But then, how complicated could the task of comparing two feasible expressions with unfeasible "values" be?
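A simple bound (my own remark) shows just how unfeasible the conversion may be: over {1, +, *}, every expression e satisfies value(e) <= 2^k, where k is the number of occurrences of 1 in e (by induction: 2^a + 2^b <= 2^(a+b) and 2^a * 2^b = 2^(a+b) for a, b >= 1). Hence any {1, +, *} expression equivalent to Bernays' 67^257^729 must contain at least 257^729 * log2(67) occurrences of 1 - already its length is unfeasible.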

As a first step, I would propose the following three problems:

Problem 1. How complicated is comparing the values of two expressions over the base {1, +, *}, where the longer expression has length n? (A naive baseline is sketched after Problem 3.)

Problem 2. The same for expressions over the base {1, +, *, ^}.

Problem 3. Build a functional base for which the complexity of the comparison task of Problem 1 belongs to a given complexity class.
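For Problem 1, direct evaluation already gives a naive baseline: by the bound above, the value of a {1, +, *} expression of length n fits into about n bits, so exact big-integer evaluation and comparison is polynomial in n; the real question is how much better one can do. A minimal sketch in Python (the infix input syntax is my own assumption):

import ast, operator

OPS = {ast.Add: operator.add, ast.Mult: operator.mul}

def value(expr_string):
    """Exact value of an expression over {1, +, *}, written in ordinary
    infix syntax such as '(1+1)*(1+1+1)' (an assumed input format)."""
    def ev(node):
        if isinstance(node, ast.Constant) and node.value == 1:
            return 1
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](ev(node.left), ev(node.right))
        raise ValueError('not an expression over {1, +, *}')
    return ev(ast.parse(expr_string, mode='eval').body)

def compare(e1, e2):
    v1, v2 = value(e1), value(e2)
    return '<' if v1 < v2 else '>' if v1 > v2 else '='

print(compare('(1+1)*(1+1)*(1+1)', '(1+1+1)*(1+1+1)'))  # 8 vs 9 -> '<'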

Are the solutions of these problems already known? Are they trivial? I was not able to find the answer on the Internet. But the idea was at least touched upon on the FOM list in October 1998:

by Harvey Friedman: "THEOREM 3. n(3) > A_7198(158386)." (see http://www.cs.nyu.edu/pipermail/fom/1998-October/002339.html);

by Anatoly Vorobey: "You didn't, after all, prove that n(3)=A100(200); but you did prove that n(3)>A7(184). What does *that* mean?... So, a finitist/feasabilist may ask, what kind of information, insight, knowledge, does n(3)>A7(184) offer you?" (see http://www.cs.nyu.edu/pipermail/fom/1998-October/002367.html);

and by Vladimir Kanovei: "Most likely 9^9^9... (8 times) < 8^8^8...(9 times). If so what does it mean?" (see http://www.cs.nyu.edu/pipermail/fom/1998-October/002370.html).

See also the web pages of Robert P. Munafo: "We cannot compute the exact values of these two numbers and compare directly - they have way too many digits to store the values on a computer." (http://mrob.com/pub/math/largenum-2.html)

and a posting by Adam: "Comparing Very Large Numbers--Which Is Bigger?... How do you know whether 10^10!^10 is bigger than 9!^9!^9?" (http://mathforum.org/library/drmath/view/66156.html).
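One indirect route behind such guesses is to compare towers through iterated logarithms instead of values. A heuristic sketch in Python (my own illustration; it drops lower-order terms, so it settles only clearly separated towers, not near-ties):

import math

def compare_towers(b1, h1, b2, h2):
    """Heuristically compare b1^^h1 with b2^^h2, where x^^k denotes the
    tower x^x^...^x with k copies of x, via log10(b^^h) = b^^(h-1) * log10(b).
    While both towers are tall, one log10 is peeled from each side and the
    small factor log10(b) is dropped, since after the next peeling it would
    survive only as a negligible additive term. Assumes the two heights
    differ by at most one, as in Kanovei's example."""
    while h1 >= 3 and h2 >= 3:           # peel a log10 from both sides
        h1, h2 = h1 - 1, h2 - 1
    def short_tower_log10(b, h):         # exact for h = 1, 2, 3
        if h == 1:
            return math.log10(b)
        if h == 2:
            return b * math.log10(b)
        return (b ** b) * math.log10(b)  # needs b^b to fit in a float
    v1, v2 = short_tower_log10(b1, h1), short_tower_log10(b2, h2)
    return '<' if v1 < v2 else '>' if v1 > v2 else '?'

# Kanovei's example: 9^9^...^9 (8 nines) vs 8^8^...^8 (9 eights).
print(compare_towers(9, 8, 8, 9))        # prints '<', supporting his guess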

But the very first may have been J. E. Littlewood (sorry, my own translation back from Russian): "The reader will agree that we have written large numbers; however, it is hard to imagine how large they are; all that can be said about them is that they are determined by their definitions. If we needed to compare two numbers from two competing systems, the creation of a serious mathematical apparatus might be required." (See his paper "Large Numbers", Mathematical Gazette, 32, No. 300, July 1948; reprinted in Littlewood's Miscellany.)

December 28, 2005

Posted on the FOM list, December 29, 2005 - see http://www.cs.nyu.edu/pipermail/fom/2005-December/009504.html.


[Added January 24, 2006. Inspired in part by discussions with David Isles and Vladimir Sazonov.]

Couldn't the thesis "expressions are more fundamental than natural numbers" be formalized as the following version of the induction schema:

F(0) & AxAy[F(x)&F(y) -> F(Sx)&F(x+y)&F(x*y)&F(x^y)&...] -> AzF(z),

i.e. by including all the function constants allowed in the language?

In the traditional schema:

F(0) & Ax[F(x) -> F(Sx)] -> AzF(z),

only the successor function is used, and this allows easy proofs of algorithmically unfeasible properties of natural numbers. For example, one can prove by (the traditional) induction that

~(x=0) -> Ey(x=Sy),

but, for expressions built from 1, +, *, ^, the task of actually finding y may be unfeasible.
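A toy sketch in Python (my own, with an assumed tuple representation) makes the contrast concrete: over {1, +, *}, a predecessor expression can even be built symbolically, using (a+b)-1 = a+(b-1) and a*b-1 = (a-1)*b + (b-1); for ^, however, the analogous identity a^b - 1 = (a-1)*(a^(b-1) + ... + a + 1) has unfeasibly many summands, and the only obvious route - evaluating the expression - fails for Bernays-sized values.

def pred(e):
    """e is 1, ('+', a, b) or ('*', a, b); returns an expression (or the
    special constant 0) whose value is value(e) - 1. No expression over
    {1, +, *} has value 0, hence the special constant."""
    if e == 1:
        return 0                          # 1 - 1 = 0
    op, a, b = e
    if op == '+':                         # (a+b) - 1 = a + (b-1)
        pb = pred(b)
        return a if pb == 0 else ('+', a, pb)
    pa, pb = pred(a), pred(b)             # a*b - 1 = (a-1)*b + (b-1)
    if pa == 0:                           # here value(a) = 1, so a*b - 1 = b - 1
        return pb
    return ('*', pa, b) if pb == 0 else ('+', ('*', pa, b), pb)

two_times_three = ('*', ('+', 1, 1), ('+', ('+', 1, 1), 1))
print(pred(two_times_three))              # an expression with value 5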


[Added February 2, 2006]

Andras Kornai. "Explicit Finitism." International Journal of Theoretical Physics, 2003, No. 2, pp. 301-307 (online copy available):

"To summarize, it is not the raw size, but rather the information content of a number that determines its accessibility. The key issue here is randomness: powers of 2, towers of such powers, and in general the values of the Ackermann function, are far from random, both in the technical sense of Kolmogorov complexity, and in the more physical sense of randomness that we will first rely on. Some numbers x are more accessible to our arithmetic than other numbers in the sense that we can establish certain arithmetic statements D(x) but not D(y). It follows from the cardinality argument that we must have many numbers z about which nothing but trivial arithmetic facts, such as 0 * z = 0 can be established."


[Added August 31, 2006]

If, in trying to formalize ultra-finitist concepts, we allow "arbitrarily long" chains of applications of inference rules, then this may create a kind of vicious circle.

Doesn't this mean that, in general, unrestricted repeated application of standard logical inferences may lead to incorrect results? Always, or only in some situations?


[Added May 27, 2011]

Indeed, rules of inference, logic(s), and any other means of reasoning are no more than (more or less successful) means of building and exploring models selected by trial and error – like any other knowledge constructs.

For details see my paper: The Formalist Picture of Cognition. Towards a Total Demystification.

There is no such thing as a "universal logic" allowing one to draw "true" consequences from "true" axioms. Hence, why should we believe that unrestricted repeated application of inferences will always lead to meaningful results? It never did and it never will! (Frederick Douglass)


