Rosen Lutskanov
Changing rules as we go along
I.
1. von Neumann: the impact of Gödel’s results upon the intellectual climate in Europe may be compared only with the impact of Einstein’s relativity theory.
2. Hofstadter: the incompleteness results may be blamed for the general expectation of limitative results.
II.
1. Gödel [1932]:
a. the solution of certain arithmetical problems requires the use of assumptions essentially transcending arithmetic (i.e. the domain of indisputable evidence) – falsifiability
b. new axioms will be necessary – open-endedness
c. arithmetic loses its epistemic dignity – epistemic insecurity
2. The dynamics of mathematical knowledge derives from mathematical ontology (set theory):
a. cardinality axioms (strong axioms of infinity)
b. the reflection principle: there is no definable characteristic of the universe of set theory
c. Cantor’s third generative principle
3. Gödel: the axioms of set theory do not form a system closed in itself, because the sets form a totality which cannot be defined from the outset (the iterative conception of sets)
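Point 2.b can be given a precise reading via the Lévy–Montague reflection theorem; the following standard schema is a gloss added here for illustration, not a quotation from Gödel:

```latex
% Lévy–Montague reflection (a schema: one instance per formula \varphi).
% Every property expressible in the language of set theory and true in V
% already holds in some initial segment V_\alpha, so no formula can
% single out the universe itself.
\[
  \mathrm{ZF} \vdash \forall \beta \, \exists \alpha > \beta \;
  \forall x_1 \dots x_n \in V_\alpha \,
  \bigl( \varphi(x_1,\dots,x_n) \leftrightarrow
         \varphi^{V_\alpha}(x_1,\dots,x_n) \bigr)
\]
```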
III.
1. Epistemic insecurity and ontological instability of contemporary mathematics – the foundational crisis
2. Hilbert’s foundational program cannot be accomplished in its original form: no axiomatic system is all-encompassing (umfassend)
3. Putnam: dismiss foundational talk
IV.
1. Hilbert: the whole of modern culture is grounded upon mathematics
2. It is a general trait of epistemic progress that it endangers even the most fundamental principles (Poincaré)
3. Ergo: no statement is immune to revision (Quine)
V.
1. That is why the dominant trend in the philosophy of mathematics is empiricism (Detlefsen)
2. Lakatos: the progress of mathematical knowledge through critical speculation (Proofs and Refutations) is the same as the progress of empirical science
3. Toulmin: we must go beyond the static picture of knowledge (in both the empirical and the deductive sciences)
4. The relevance of Toulmin’s remark for mathematics can be seen, for example, in Feferman’s ordinal progressions of logics, presented in his ASL2000 address.
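A standard way to spell out such ordinal progressions (in the Turing–Feferman tradition; the schema below is an illustrative gloss, not taken from the address itself) is to iterate consistency extensions along the ordinals:

```latex
% Transfinite progression of theories, starting from Peano Arithmetic:
\begin{align*}
  T_0          &= \mathrm{PA} \\
  T_{\alpha+1} &= T_\alpha + \mathrm{Con}(T_\alpha)
      && \text{(add the consistency statement)} \\
  T_\lambda    &= \bigcup_{\alpha < \lambda} T_\alpha
      && \text{(for limit ordinals } \lambda \text{)}
\end{align*}
```

Each step adds a true statement the previous theory cannot prove, so the progression is a minimal formal model of knowledge that grows by reflecting on itself.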
VI.
1. If we take seriously the thesis of the quasi-empirical character of mathematics, we must search for formal ways to model its dynamics. The standard approaches are not appropriate:
a. The dynamic-logic approach has a nice computational semantics, but we must model the whole of classical mathematics, not only its intuitionistically valid (i.e., effectively calculable) part.
b. The AGM approach models the epistemic behavior of ideally rational agents confronted with expanding empirical information, but it does not fit situations in which the growth of information is brought about by the utilization of deductive principles.
c. The Ghent approach makes possible the construction of a rich variety of adaptive logics, some of which have been used in foundational contexts, but it does not provide us with an adequate picture of the sources of the dynamics of mathematical knowledge.
2. This source is, of course, the self-reflexivity of sufficiently rich formal systems. So we need a foundational framework which produces dynamics through self-reflexivity.
3. At first this sounds strange, because the whole development of modern mathematics was driven by the necessity to ban self-reference. But Gödel has shown that the most respected solutions (Russell’s Type Theory and Zermelo–Fraenkel Set Theory) are too drastic. According to Kripke, Gödel put the issue of the legitimacy of self-referential sentences beyond any question.
4. Recent advances in set theory show that Kripke’s remark is true: Aczel’s set theory, for example, in which the axiom of foundation is replaced by the axiom of anti-foundation, implying the existence of non-well-founded (possibly circular) sets, was provided with a relative consistency proof: it is consistent if ZF is.
VII.
1. Luckily, we have a representative recent attempt to build a theory of truth which embraces the unavoidable possibility of self-referential constructions: the revision theory of truth.
2. Gupta’s revision theory is founded on the observation that the standard solution of the paradoxes – their elimination by banning self-referential constructions – leads immediately to undefinability results such as Tarski’s classical theorem on the arithmetical undefinability of truth (and the rest of the limitative results).
3. Gupta’s thesis says that we must acknowledge that the domain of the meaningful is much more encompassing than we have previously thought, i.e. that self-referential (paradoxical) constructions are perfectly meaningful.
4. In fact, Gupta considers the possibility that the concept of truth is itself self-referential (the constant re-emergence of Liar-type constructions strongly motivates this claim). This means that it cannot be characterized by an application procedure, but by a revision procedure, which makes it possible for us to produce better estimates of its extension, once we are provided with a hypothesis about this extension.
5. Every circular construction may be interpreted as a rule of revision which defines some particular revision sequence. If the construction is viciously circular (pathological), then this sequence does not terminate. The important fact is that not all circular constructions are vicious.
6. The revision sequences which satisfy some general conditions (known as Thomasson’s conditions) possess two characteristics:
a. they settle down after a finite number of applications of the revision rule;
b. they converge to a unique fixed point, no matter what the initial hypothesis was.
7. In this way, revision theory provides an explanation of logical and mathematical necessity: necessary are those propositions which are reached in the revision sequence come what may, i.e. which are unavoidable in some perfectly defined sense.
8. Such a reconstruction of necessity is in accordance with Plantinga’s observation that necessity is not the same thing as unrevisability or ungiveupability.
VIII.
1. Revision theory is a prime example of a dynamic semantic theory of self-referential constructions. But if we want a general dynamic framework for mathematical knowledge, we must supply revision theory with a properly syntactic counterpart.
2. I suppose that Wittgenstein’s philosophy of logic may provide us with valuable intuitions which will help us to accomplish this task.
3. Let us start with a thesis that Wittgenstein posed already in his 1931 lectures: that meaning is use. If we ask ourselves what the use is of a particular proposition which inhabits a particular deductive framework, we may answer that we may use the proposition to prove other propositions. So the meaning of every proposition may be excavated from the bulk of the proofs that rely on it. But we may use a proposition to prove other propositions only if it has already been proved. So, ultimately, the meaning of every proposition is contained in the meaning of all the sets of propositions that are sufficient to prove it.
4. In this setting, the construction of a new proof in a deductive system transforms the system itself: it establishes new connections between the propositions and expands their deductive uses.
5. But this in its turn changes the proof itself, because a proof is a particular arrangement of propositions and is functionally dependent on their meaning. This means that we have here a self-adjustment of the rules of proof in the system, i.e. of the rules according to which the proofs are constructed. In his Philosophical Investigations Wittgenstein considered such an idea (the possibility to “change the rules as we go along”) in the context of the mentioned circular dependence between the meaning of a proof and that of the propositions occurring in it.
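The self-expanding picture of proof in points 3–5 can be modeled, very roughly, by forward chaining: a proposition becomes available as a premise only once it has been proved, so each new proof enlarges the deductive resources of the system. The following Python sketch (the function name and the mini-system are hypothetical, chosen only for illustration, and are of course not Wittgenstein’s own formalism) captures that idea:

```python
# A minimal sketch of a self-expanding deductive system: rules can only
# fire on propositions that have already been proved, so every new proof
# changes what later proofs can use -- "the rules change as we go along".

def deductive_closure(axioms, rules):
    """rules: list of (premises, conclusion) pairs, premises a frozenset."""
    proved = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # A rule fires only when all its premises are already proved.
            if conclusion not in proved and premises <= proved:
                proved.add(conclusion)  # expands the system's resources
                changed = True
    return proved

# Hypothetical mini-system: A is an axiom; A |- B; {A, B} |- C; D |- E.
rules = [(frozenset({"A"}), "B"),
         (frozenset({"A", "B"}), "C"),
         (frozenset({"D"}), "E")]
print(deductive_closure({"A"}, rules))  # E stays unproved: D is never reached
```

Note that C becomes provable only after B has been proved, even though no rule for C was added: proving B changed what the system could do next.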