I’ve been reading more of Maddy’s book *Naturalism in Mathematics*, and she gives an interesting case history for a realist argument against Gödel’s axiom of constructibility, V=L. She does this by giving parallels with the development of physical science from Mechanism to a broader Physicalism. I think the analogies might well have something to say in other philosophical debates as well.

Apparently, in the wake of Galileo’s and Newton’s successes in explaining parts of nature mechanically, there was a thought that everything in nature could be explained in terms of particles acting only on one another, in the direction of the line segment separating them, with forces that depend only on their distance from one another. Eventually, with the kinetic theory of gases, this model was able to explain a huge range of phenomena. However, with Oersted’s experiments on how electric currents affect a compass, it became clear that some forces can act perpendicularly and with a strength that depends on the speed of the current and not just the distance. Though ad hoc modifications were able to temporarily save Mechanism, it eventually became clear that the electromagnetic field was required in addition to the particles. Thus, although Mechanism had succeeded for a long time, we eventually needed to broaden our scope of theories, and this became Physicalism.
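The contrast can be made precise in modern notation (my gloss, not Maddy’s): a mechanist force between two particles is central and depends only on their separation, whereas the magnetic force that Oersted’s experiments pointed toward depends on velocity and acts perpendicular to it.

```latex
% Mechanist force law: central, directed along the line between
% particles i and j, with magnitude depending only on distance
\mathbf{F}_{ij} = f(r_{ij})\,\hat{\mathbf{r}}_{ij},
\qquad r_{ij} = \lVert \mathbf{x}_i - \mathbf{x}_j \rVert

% Magnetic part of the Lorentz force on a charge q: depends on the
% velocity v and is perpendicular to it -- no central force law of
% the above form can reproduce this behavior
\mathbf{F} = q\,\mathbf{v} \times \mathbf{B}
```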

Maddy then describes a similar development in mathematics from a position she calls Definabilism to one she calls Combinatorialism. Definabilism is the idea that all functions (and in a sense, all objects) capable of mathematical study are somehow definable. Combinatorialism is the more modern picture, where absolutely arbitrary functions and objects are permitted. Descartes apparently considered only algebraically definable curves when he invented the idea of coordinate geometry. Later, in investigating the behavior of vibrating strings using partial differential equations, D’Alembert noted that the behavior of an actual plucked string couldn’t be modelled by his theory, because the initial condition (when the string is shaped like two straight lines with an angle between them) wasn’t a mathematical function. But Euler and one of the Bernoullis were eventually able to show that this function could be described properly (as the sort of piecewise function we’re familiar with from high school calculus textbooks today) and the differential equation could still be solved. Fourier showed how to represent such a function as a sum of sines and cosines, and conjectured that all such functions could be so represented. But as people worked towards proving this conjecture (and meanwhile rigorizing calculus), they started coming up with new counterexamples, leading eventually to the pathological functions considered in any real analysis class today (the function that is 1 on the rationals and 0 on the irrationals, the one that takes value 0 on the irrationals and 1/q on any rational p/q in lowest terms, Weierstrass’ function that is continuous everywhere and differentiable nowhere). Still, these functions were all definable, in some suitably general sense.
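For concreteness, the three pathological functions mentioned can be written out in the standard modern way (these statements are textbook formulations, not Maddy’s notation):

```latex
% Dirichlet function: 1 on the rationals, 0 on the irrationals
% (nowhere continuous, yet clearly definable)
D(x) = \begin{cases} 1 & \text{if } x \in \mathbb{Q} \\
                     0 & \text{if } x \notin \mathbb{Q} \end{cases}

% Thomae's function: 0 on the irrationals, 1/q at p/q in lowest terms
% (continuous exactly at the irrationals)
T(x) = \begin{cases} 1/q & \text{if } x = p/q \text{ in lowest terms} \\
                     0   & \text{if } x \notin \mathbb{Q} \end{cases}

% Weierstrass function: continuous everywhere, differentiable nowhere
% (for 0 < a < 1, b a positive odd integer, ab > 1 + 3\pi/2)
W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x)
```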
But after the work of Cantor in analyzing the sets of points of discontinuity of such functions, the French and Russian analysts were eventually able to classify the Borel and analytic sets, and showed by cardinality arguments that there must be still stranger functions. Thus, they gradually adopted the modern Combinatorialist approach, whereby a function is just an arbitrary association.
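The cardinality argument alluded to here can be sketched in a line or two (again my gloss, not Maddy’s): there are only continuum-many Borel or analytic sets, since each is coded by a countable construction from rational intervals, but there are strictly more arbitrary functions.

```latex
% Each Borel set is built by a countable transfinite process from
% open intervals with rational endpoints, so there are only
% continuum-many of them:
|\mathcal{B}| = 2^{\aleph_0} = \mathfrak{c}

% But the set of all functions from \mathbb{R} to \mathbb{R} is
% strictly larger, by Cantor's theorem:
|\mathbb{R}^{\mathbb{R}}| = \mathfrak{c}^{\mathfrak{c}} = 2^{\mathfrak{c}} > \mathfrak{c}

% So "most" functions are not Borel, nor definable in any scheme
% that admits only continuum-many definitions.
```

Even the indicator functions of arbitrary subsets of the reals already give $2^{\mathfrak{c}}$ many functions, so no classification with only continuum-many entries can cover them all.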

Maddy suggests that adopting V=L would be a return to Definabilism, and in fact a very restricted form of it (where only predicative definitions are allowed). Instead, because of adopting the Combinatorialist position, we should opt for “maximizing” rather than “minimizing” principles in set theory.

I think the situation here should be somewhat familiar from debates in metaphysics. The two I’m thinking of in particular are about the range of possible worlds that exist (they can be ersatz worlds), and about what collections of objects compose a further object. I’m not sure what most metaphysicians think is the range of possible worlds that exist, but I think most people think it’s more than just the worlds compatible with actual physical laws, and less than the set of all logically possible worlds. (Of course, there are also the Australians who believe in impossible worlds as well.) And in mereology, we have the debate between fans of unrestricted composition and those who argue for a more moderate answer to the composition question. If there are enough analogies with the mathematical situation that Maddy describes, then perhaps we should adopt the most permissive answer possible in each case. I tend to think, however, that there should be some restriction in all three cases (set theory, possible worlds, and composition). The restrictions I would pose are much more permissive than V=L, or merely finite composition in mereology. But I don’t think there’s a coherent way to say “any collection of things forms a set” or “any collection of objects composes a further object” without running into the set-theoretic paradoxes. But I have no idea how one could phrase this sort of middle way that I propose (which most people would probably still consider to be quite extreme towards the universalist picture).

Greg Frost-Arnold (01:35:16): One option on the table for those who would like to take “the most permissive answer possible” has been advocated by Carnap and van Fraassen; however, I’m not sure their strategy can be made to work in set theory (see next paragraph). In *The Empirical Stance*, vF suggests that physicalism and/or materialism should not be taken as assertions (and thus capable of being true or false), but rather as a stance, which is a set of dispositions to act in certain ways (and is thus incapable of being true or false). Carnap said basically the same thing about the status of empiricism in “Testability and Meaning”: it’s not a thesis but a proposal or stance. Carnap and vF both hold this view in part because certain theses intended to capture the essence of empiricism — e.g. ‘All knowledge comes ultimately from experience’ and (more clearly) the verification criterion of meaning — look like they might be self-defeating. And this situation could be considered analogous to the paradoxes of naive set theory. So one avenue someone with ‘maximizing’ proclivities could consider is converting assertions (which could lead to paradoxes) into stances.

However, an appeal to stances might not work for set theory, even if such an appeal is acceptable in the case of materialism/physicalism. Why? Set theory has axioms for set production, i.e., how to generate a new set from given ones. There is no analogue that I can see in the case of the physical sciences: there is no postulate of the theory that lets you generate electrons (that actually exist — of course, certain systems that the theory declares physically possible can be combined to make a new physically possible system, but that new system may not actually exist). So set theory, unlike physics, contains within its fundamental postulates claims about which particular entities of its chosen subject-matter exist. So in set theory, we may not have the option of converting our maximizing or minimizing tendencies from an assertion to a stance.

And if that’s right, and if van Fraassen is right that materialism and physicalism should be considered stances, then Maddy’s nice historical analogy breaks down somewhat — because I don’t think that the ontology of fundamental physics itself can be meaningfully said to be monotonically increasing since the 17th C: certain things that we thought existed have been shown not to exist, as well as the other way around. (No modern physicist believes that Gassendi’s atoms exist.)

Kenny Easwaran (18:38:09): That’s an interesting point you make – Mechanism and Physicalism are methodological stances, while Definabilism and Combinatorialism are closer to actual axioms or propositions. Stances are just useful or not useful, rather than being true or false. However, I think there still is something to Maddy’s analogy. Combinatorialism is about more than just an axiom for set theory – it’s a methodology for considering functions in analysis and other things as well. There’s no specific axiom it suggests one adopt, but it does suggest denying V=L (so I suppose it means adopting the axiom V≠L). But there is still a further disanalogy – the methodologies we discuss in physics seem to say things of the form “every phenomenon is explained by some material particles acting on one another directly” or “every phenomenon is explained by interactions between fields and particles”, and are thus limitative. They tell us that the new things we propose should be of such and such a form and not something else. Combinatorialism as stated though seems to tell us that “everything exists” or something of the sort, rather than putting any limitation on things. It doesn’t seem to be as useful a way of directing progress in the relevant science.