Azzouni on Deflation

27 07 2005

Now that I’ve finished Maddy’s Naturalism in Mathematics, I’ve started reading Jody Azzouni’s recent book, Deflating Existential Consequence, which apparently tries to argue that although existential claims about mathematical entities (and many other entities) may be indispensable to our best scientific theories, this doesn’t mean we’re “ontologically committed” to them. I suppose I’ll get into that stuff later, but for now I’m reading his early chapters about truth.

He suggests that we must be deflationists about truth in order to use the truth predicate the way we do. One of the important uses, he suggests, is “blind ascription”, which is when I say something like “What Mary said is true”, rather than actually exhibiting a sentence in quotation marks followed by “is true”. We clearly have reason to engage in blind ascription of truth at various points in our scientific theorizing, whether in talking about the consequences of an infinite theory (or at least an unwieldy and large one), or in using a simplified version of a theory to make a calculation (like replacing “sin t” by “t” in calculating the period of oscillation of a pendulum) and suggesting that “something in the vicinity of this result is true”. In order for blind ascription to work, he suggests, we need a theory of truth that endorses every Tarski biconditional of the form “‘___’ is true iff ___”. But he suggests that only the deflationist about truth can really do this.
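To make the pendulum example concrete, here is a small Python sketch (my own illustration, with assumed parameter values, not anything from Azzouni) comparing the small-angle period 2*pi*sqrt(L/g), which comes from replacing sin t by t in the equation of motion, with the exact period computed from the complete elliptic integral. At modest amplitudes the two agree to within a fraction of a percent, which is precisely the sense in which “something in the vicinity of this result is true”.

    import math

    g, L, theta0 = 9.8, 1.0, 0.3   # gravity, pendulum length, amplitude (assumed values)

    # Small-angle approximation: replacing sin(theta) by theta gives T = 2*pi*sqrt(L/g)
    approx = 2 * math.pi * math.sqrt(L / g)

    # Exact period: T = 4*sqrt(L/g)*K(k) with k = sin(theta0/2), where K is the
    # complete elliptic integral of the first kind, estimated here by a midpoint sum
    k = math.sin(theta0 / 2)
    n = 100000
    h = (math.pi / 2) / n
    K = h * sum(1 / math.sqrt(1 - (k * math.sin((i + 0.5) * h)) ** 2)
                for i in range(n))
    exact = 4 * math.sqrt(L / g) * K

    print(approx, exact)   # about 2.007 vs 2.019: within roughly half a percent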

The problem is that any supplementation of deflationist (Tarski-biconditional governed) truth faces a dilemma. Either it falsifies a Tarski biconditional (and so proves unfit for blind ascription), or it fails to be a genuine supplementation of the deflationist notion of truth.

As an example, he considers the requirement one might have that a certain type of compositionality holds. That is, “snow is white” is true just in case there is some object that “snow” refers to and a predicate referred to by “white” that applies to that object. If this requirement goes beyond the requirement of the biconditional, then such a compositional notion of truth will be unfit for blind ascription. But if it doesn’t, then he says that this requirement is “toothless”, and doesn’t get us a notion of truth any different from the deflationist.

This latter claim seems to me to be wrong, though. Both Davidson (in “Truth and Meaning”) and Field (in “Tarski’s Theory of Truth”) apply such a requirement to the truth predicate. While Davidson seems happy to take a deflationist account of truth and then use the compositionality requirement to explicate the notions of reference and meaning, Field does something different. Field (at least at the time of that paper, and probably into the mid-’80s) wanted a physicalist explanation of the notions of reference and meaning for individual words, and then used the notion of compositionality to define truth. Then, using the Tarski biconditionals, we can understand just what our ordinary sentences commit us to, and we have used truth as a step in understanding language, rather than using the understanding of language as a step in explaining reference, as Davidson wanted.

To see that Field’s notion of truth in this case isn’t just deflationary, I point to a usage I believe Field mentions in a much later paper (“Correspondence Truth, Disquotational Truth, and Deflationism”, from 1986). This is the example that convinced me not to be a deflationist about truth, though ironically I hear that Field became one very soon afterwards. For the deflationist, the Tarski biconditionals are constitutive of the meaning of the truth predicate, so “‘Grass is white’ is true” has just the same content as “Grass is white”. Similarly, “‘Grass is white’ might have been true” is the same as “Grass might have been white”. To see the problem, note that as a result, “If we had used our words differently, ‘Grass is white’ might have been true” comes out the same as “If we had used our words differently, grass might have been white”. But intuitively, the former is correct and the latter not, so the deflationist must be wrong. I think the compositional account of truth gets this right, and the biconditionals are then useful only to understand how language works and to establish the practice of blind ascription.
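Schematically (in my own notation, not Field’s or Azzouni’s): if the biconditionals are constitutive of the meaning of the truth predicate, then the two sides of each instance have the same content, and so are intersubstitutable even inside modal and counterfactual contexts. Writing T for the truth predicate and the diamond-arrow for the might-counterfactual, the deflationist is committed to the equivalence of

    W \mathbin{\Diamond\!\!\rightarrow} T(\ulcorner \text{grass is white} \urcorner)
    \qquad \text{and} \qquad
    W \mathbin{\Diamond\!\!\rightarrow} (\text{grass is white})

where W abbreviates “we had used our words differently”; intuitively the first is true and the second false.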

So I think on this point Azzouni is not quite right: we can have a non-deflationary position that asserts all the same biconditionals, and is thus fit for blind ascription, and thus for science. In fact, I think there’s reason to believe that this is the right sort of truth theory, though of course that’s quite hard to argue. He has a footnote that points out what I take to be the Davidsonian position, but I think he might miss the Fieldian position. Unless I’m wrong about what this footnote is supposed to mean.





Recreational Mathematics?

25 07 2005

One of Penelope Maddy’s objections to the indispensability argument as the justification for mathematical practice is that it seems to make set theory a hostage to quantum gravity. That is, if set theorists are realists, and are justified in their position because their work is an indispensable part of fully specifying the mathematical theories that are an inherent part of physics, then they should be eagerly awaiting the results of physical research to find out just what things need to be included in order to support the physics.

A potential response is to notice that the form of this argument is just the same as the argument against Quine’s web of belief that points out that “2+2=4” is not going anywhere. I think Maddy is right to notice that higher set theory is at least more vulnerable to this sort of attack than basic number theory. I would be hard-pressed to imagine a version of science that doesn’t apply basic number theory, while the motivations for set theory stem from deep reasoning about the continuum and about various other transfinite objects.

Mark Colyvan responds by suggesting that in fact, some mathematics may not be applied. Such mathematics carries with it no ontological commitments, and he calls it “recreational mathematics”. (I don’t recall if Maddy used this term as well.) He suggests that set theorists may be free to investigate set theory however they want, because part of it will be applied, and part will be purely recreational.

However, this doesn’t seem right to me. Set theory seems to have a fairly unified methodology (ignoring the fact that Californians work on extending large cardinal axioms and solving CH, while east coasters and Israelis do something else), and this applied/recreational divide would cut across this work in some totally unexpected way. It doesn’t seem plausible that this divide between the applied and the recreational could be important enough to base ontology on, but unimportant enough that practitioners don’t even notice it.

I think that it is far more likely that either all set theory falls on the applied side, or almost all of it falls on the recreational side. (I say “almost all”, because I could imagine the countable being seen as applied while the uncountable is recreational.) It seems that once one adopts ZFC, there are important reasons to adopt large cardinal axioms. Maddy gives these explanations quite well (see her “Believing the Axioms” parts I and II in the 1988 Journal of Symbolic Logic) and I think both she and Colyvan are unnecessarily worried that proper scientific methodology might only pick out some of this. The real worry about indispensability will come much lower down, and I think the argument may well be found wanting at that level. But rather than considering the rest of mathematics to be “recreational”, a Fieldian fictionalist position will be the right attitude to take. In practice, this should be no different from the realism that Colyvan supports, or the agnosticism that Maddy seems to endorse now.





Academic Blogging

24 07 2005

The philosophy blogosphere has gotten extremely large now (just check out Dave Chalmers’ list of philosophy blogs in the link on the sidebar). There are so many now that there are even relatively well-defined subcommunities, like the collection of logic blogs. But at the same time, it’s become clear that just as philosophy is far ahead of most other academic fields in terms of number of blogs, certain subfields of philosophy seem to be pulling ahead as well. For instance, I often stumble across blogs talking about epistemology or language, but much less often about metaphysics.

This pattern seems to be repeated in other academic disciplines. Most of the physics blogs I’ve ever stumbled across have been about string theory. I can’t help but wonder if these concentrations of blogs will reshape the disciplines. (Certainly Brian Leiter and Brian Weatherson are both better known than they would be without their online presences, though in both cases their target audience is a broad range of philosophers, rather than just epistemologists, or philosophers of language, or logicians, or philosophers of mathematics, or whatever.)

Just as prominent women have helped make certain subdisciplines more gender-balanced (for instance, Vera Serganova seems to have attracted a large proportion of the female math grad students at Berkeley to her area), I’m sure early-adopting bloggers will have some influence on the way methods of communication develop in certain subfields.

But as academic blogging matures, we’ve got important methodological questions to consider, in addition to these sociological ones. An interesting post by Anthony Widjaja To, which I meant to link to a while ago, suggests some important starting points. Related to those, I’ve often wondered what balance I should strike between stating a question in its simplest terms and giving enough background information to allow a much larger audience to follow. For instance, when I planned my last post, it was going to be two sentences, until I realized that I should probably say what surreal numbers are. But meanwhile, a lot of my posts also react to some book or paper I’ve been reading without much explanation of what the author was actually saying. I’ve been more open to doing that with important papers by figures like Quine and Lewis than with recent work in philosophy of mathematics, but I think I’ve also had some discussion of dense papers by Dummett that I can’t assume many people have read. I think this issue relates quite closely to Anthony’s suggestions for reshaping academic discussions for the blog format.

The more direct inspiration for this post was a suggestion for The Blog as a Sharp Tool for Research by physicist Clifford Johnson at one of those string-theory-ish blogs I mentioned. He suggests an interesting model whereby a blog (or blog-like enterprise) can help focus discussion in a subfield by changing hands periodically, allowing different individuals or groups working in the field to act as host at different intervals. I believe Left2Right and Certain Doubts originally intended to have a function somewhat like this, but given the natural tendency for some participants in a group blog to post more and some to post less, they’ve ended up being more like much smaller blogs by David Velleman and Don Herzog at L2R, and Jon Kvanvig at Certain Doubts. The new model of giving each person or group a specific period to be in charge seems like it might work better at achieving this end. It would probably also encourage more posting about specific breaking research, since each person or group would only have to talk about their own work for a small portion of the time, rather than scattering it among posts about smaller thoughts the way it happens on individual blogs. Such a model shouldn’t replace the individual blogs we have, of course, but it would allow for yet another type of discussion. (I should also at least briefly mention the Philosophers’ Carnival started by Richard Chappell, though that serves yet another function: to circulate more widely the discussions in the large philosophical blogging community, rather than to develop one particular subfield.)





Surreal Numbers and Set Theory

23 07 2005

Here’s a more purely mathematically-oriented post.

John Conway has developed a class of entities he calls “surreal numbers”, which he describes in his book On Numbers and Games, and I believe in some simpler sources as well. He developed them originally to describe combinatorial games, but later noticed that they generalized both the von Neumann construction of the ordinals and the Dedekind cut construction of the reals. Each surreal number is an ordered pair of sets of surreal numbers. The simplest one has the empty set on both sides, and is treated as the number 0. We can define an ordering on the surreals recursively, saying that x>y if there is a z in the left set of x such that z=y or z>y, and that x<y if there is a z in the right set of x such that z=y or z<y. These orderings aren’t exactly duals of one another, and aren’t always transitive, but they turn out to generalize the standard notions.

If we let the right set remain empty, and let the left set be {0,1,…,n}, then we get the number n+1. To get the negative natural numbers, we basically reverse the construction, letting the left set be empty and setting the right set to {0,-1,…,-n} to get -(n+1). We can generalize both of these constructions to the transfinite in the usual way, letting the positive transfinite ordinals be those whose left set contains all smaller ordinals and whose right set is empty, and dually for the negatives. We count two surreal numbers as equal if each has elements in its left set at least as large as any element in the left set of the other, and each has elements in its right set at least as small as any element in the right set of the other. Thus, each integer could be represented with a singleton instead of the standard set.

We generate the rationals whose denominator is a power of 2 by letting ({x},{y}) represent (x+y)/2 whenever x and y differ by 1/2^n for some n. Thus, 1/2=({0},{1}), 3/4=({1/2},{1}), and 17/64=({1/4},{9/32}). From these, we define all the real numbers by the Dedekind cut construction.
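Since the recursive definitions are easy to get tangled in, here is a minimal Python sketch of the basic machinery (entirely my own code and naming; for the comparison I use Conway’s standard clause, on which x<=y just in case no member of the left set of x is >=y and no member of the right set of y is <=x, rather than the pair of orderings sketched above):

    class Surreal:
        def __init__(self, left=(), right=()):
            self.left = tuple(left)     # left options
            self.right = tuple(right)   # right options

        def __le__(self, other):
            # Conway: x <= y iff no left option of x is >= y,
            # and no right option of y is <= x
            return (not any(other <= xl for xl in self.left) and
                    not any(yr <= self for yr in other.right))

        def eq(self, other):
            # numerical equality: each is <= the other
            return self <= other and other <= self

    ZERO = Surreal()                  # ({},{})   = 0
    ONE = Surreal([ZERO])             # ({0},{})  = 1
    NEG_ONE = Surreal([], [ZERO])     # ({},{0})  = -1
    HALF = Surreal([ZERO], [ONE])     # ({0},{1}) = 1/2

    print(ZERO <= ONE, ONE <= ZERO)             # True False, i.e. 0 < 1
    print(Surreal([NEG_ONE], [ONE]).eq(ZERO))   # True: ({-1},{1}) is another 0

The eq test implements the identification of different representations just described: ({-1},{1}) and ({},{}) count as the same number, though they are distinct pairs.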

It turns out that one can define addition and multiplication in a way that preserves the expected structure. But in addition to all the reals and ordinals, we get a lot of weirder things. For instance, we get omega-1 as ({0,1,2,…},{omega}), since this is the simplest surreal greater than every natural but less than omega. Similar constructions give omega-2, omega-3, and even things like omega/2. We can also get 1/omega as ({0},{1/2,1/4,1/8,1/16,…}), the simplest number greater than 0 but less than every positive real. By even odder constructions we get the square root of omega, the omegath root of 2, and all sorts of other crazy structure. I believe the surreal numbers end up being a model of the theory of the real numbers under addition, multiplication, and exponentiation, though they form a proper class rather than a set. (This ignores complications arising from pairs like ({0},{0}), which comes out both greater and less than zero on the definitions above; Conway requires that every member of the left set be less than every member of the right set, and counts pairs that violate this as “games” rather than numbers.)
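Conway’s addition bolts onto the sketch above directly: the left options of x+y are the sums xL+y and x+yL, and dually on the right. Here is a quick check (again my own code, continuing the earlier sketch) that 1/2 + 1/2 comes out equal to 1; genuinely infinite numbers like omega and 1/omega need infinite option sets, so they are beyond this finite toy:

    # x + y = ({xL+y, x+yL}, {xR+y, x+yR}), recursing on simpler options
    def add(x, y):
        return Surreal(
            [add(xl, y) for xl in x.left] + [add(x, yl) for yl in y.left],
            [add(xr, y) for xr in x.right] + [add(x, yr) for yr in y.right])

    # ({0},{1}) + ({0},{1}) works out to ({1/2},{3/2}), which tests equal to 1
    print(add(HALF, HALF).eq(ONE))   # True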

Now, whatever these surreal numbers are, the class of all of them is nicely definable. We let S_0={({},{})}, the set containing just 0. Then we let S_(a+1) be the product of the powerset of S_a with itself, so that we get the set of all ordered pairs of sets of surreal numbers in S_a, and we take unions at limit stages. These sets are definable by transfinite recursion, so any model of ZF containing all the ordinals must contain all of them, assuming the powerset operation is the correct one in that model. Thus, the class S of all surreal numbers bears a resemblance to Gödel’s class L of constructible sets, which is the smallest model of ZF containing all the ordinals. An important object of study in contemporary set theory is the model L(R), the smallest model of ZF containing all ordinals and all reals. Since the reals and ordinals can easily be defined from the surreals, any model containing all the surreals must include L(R). But now I wonder whether L(R) itself contains all the surreals. This would be a nice characterization of L(R), and might lead to some interesting results.
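The first few stages of this hierarchy are small enough to compute directly. A sketch (with my own naming), representing a surreal as a pair of frozensets, and counting all pairs, including game-like ones such as ({0},{0}):

    from itertools import combinations

    def powerset(s):
        s = list(s)
        return [frozenset(c) for r in range(len(s) + 1)
                for c in combinations(s, r)]

    def next_stage(stage):
        # S_(a+1) = P(S_a) x P(S_a). Each pair in S_a is itself a pair of
        # subsets of S_a, so the successor stages are automatically cumulative.
        return {(l, r) for l in powerset(stage) for r in powerset(stage)}

    S0 = {(frozenset(), frozenset())}   # S_0 = {({},{})}, i.e. just 0
    S1 = next_stage(S0)                 # 4 pairs
    S2 = next_stage(S1)                 # 256 pairs
    print(len(S0), len(S1), len(S2))    # 1 4 256: iterated-powerset growth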





More Blogs

22 07 2005

This week there were several undergraduate philosophy students visiting the ANU, at least in part for a conference on philosophical methodology. Among these students was Richard Chappell of Philosophy, etc. It’s always exciting to meet more philosophy bloggers in person. The other students I met were quite nice and interesting as well, and I’ll have to continue my discussions with them when I visit Brisbane and Melbourne over the next few weeks of my stay in Australia.

And my most recent post got a comment from Greg Frost-Arnold of Obscure and Confused, a new philosophy of science blog. I suppose it isn’t technically a logic blog, but it’s still enough in my general area that I’m glad to welcome him to this little part of the philosophy blogging community. He’s got quite an interesting post already about just what naturalism about metaphysics is supposed to mean, and another nice one about when inference to the best explanation is justified in philosophical contexts.





Definabilism and Combinatorialism

18 07 2005

I’ve been reading more of Maddy’s book Naturalism in Mathematics, and she gives an interesting case history for a realist argument against Gödel’s axiom of constructibility, V=L. She does this by giving parallels with the development of physical science from Mechanism to a broader Physicalism. I think the analogies might well have something to say in other philosophical debates as well.

Apparently, in the wake of Galileo’s and Newton’s successes in explaining parts of nature mechanically, there was a thought that everything in nature could be explained in terms of particles acting only on one another, along the line segment separating them, with forces that depend only on their distance from one another. Eventually, with the kinetic theory of gases, this model was able to explain a huge range of phenomena. However, with Oersted’s experiments on how electric currents affect a compass, it became clear that some forces can act perpendicularly, and with a strength that depends on the speed of the current and not just the distance. Though ad hoc modifications were able to temporarily save Mechanism, it eventually became clear that the electromagnetic field was required in addition to the particles. Thus, although Mechanism had succeeded for a long time, we eventually needed to broaden our scope of theories, and this became Physicalism.

Maddy then describes a similar development in mathematics, from a position she calls Definabilism to one she calls Combinatorialism. Definabilism is the idea that all functions (and in a sense, all objects) capable of mathematical study are somehow definable. Combinatorialism is the more modern picture, where absolutely arbitrary functions and objects are permitted. Descartes apparently considered only algebraically definable curves when he invented the idea of coordinate geometry. Later, in investigating the behavior of vibrating strings using partial differential equations, D’Alembert noted that the behavior of an actual plucked string couldn’t be modelled by his theory, because the initial condition (when the string is shaped like two straight lines with an angle between them) wasn’t a mathematical function. But Euler and one of the Bernoullis were eventually able to show that this function could be described properly (as the sort of piecewise function we’re familiar with from high school calculus textbooks today) and the differential equation could still be solved.

Fourier showed how to represent such a function as a sum of sines and cosines, and conjectured that all such functions could be so represented. But as people worked towards proving this theorem (and meanwhile rigorizing calculus), they started coming up with new counterexamples, leading eventually to the pathological functions considered in any real analysis class today: the function that is 1 on the rationals and 0 on the irrationals, the one that takes value 0 on the irrationals and 1/q on any rational p/q in lowest terms, and Weierstrass’s function that is continuous everywhere and differentiable nowhere. Still, these functions were all definable, in some suitably general sense. But after the work of Cantor in analyzing the sets of points of discontinuity of such functions, the French and Russian analysts were eventually able to classify the Borel and analytic sets, and showed by cardinality arguments that there must be still stranger functions. Thus, they gradually adopted the modern Combinatorialist approach, whereby a function is just an arbitrary association.
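For reference, here are the three pathological functions in modern notation (my formulation; the constraint on the parameters of Weierstrass’s series is the one from his original argument):

    f(x) = \begin{cases} 1 & x \in \mathbb{Q} \\ 0 & x \notin \mathbb{Q} \end{cases}
    \qquad
    g(x) = \begin{cases} 1/q & x = p/q \text{ in lowest terms} \\ 0 & x \notin \mathbb{Q} \end{cases}

    W(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x), \qquad 0 < a < 1,\ b \text{ an odd integer},\ ab > 1 + \tfrac{3\pi}{2}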

Maddy suggests that adopting V=L would be a return to Definabilism, and in fact a very restricted form of it (where only predicative definitions are allowed). Instead, because of adopting the Combinatorialist position, we should opt for “maximizing” rather than “minimizing” principles in set theory.

I think the situation here should be somewhat familiar from debates in metaphysics. The two I’m thinking of in particular are about the range of possible worlds that exist (they can be ersatz worlds), and about which collections of objects compose a further object. I’m not sure what most metaphysicians think is the range of possible worlds that exist, but I think most people think it’s more than just the worlds compatible with actual physical laws, and less than the set of all logically possible worlds. (Of course, there are also the Australians who believe in impossible worlds as well.) And in mereology, we have the debate between fans of unrestricted composition and those who argue for a more moderate answer to the composition question. If there are enough analogies with the mathematical situation that Maddy describes, then perhaps we should adopt the most permissive answer possible in each case. I tend to think, however, that there should be some restriction in all three cases (set theory, possible worlds, and composition). The restrictions I would pose are much more permissive than V=L, or than merely finite composition in mereology. But I don’t think there’s a coherent way to say “any collection of things forms a set” or “any collection of objects composes a further object” without running into the set-theoretic paradoxes. And I have no idea how one could phrase the sort of middle way I propose (which most people would probably still consider quite extreme towards the universalist picture).





Maddy’s View of Axioms

16 07 2005

In chapter I.2 of Naturalism in Mathematics, Penelope Maddy suggests that the role of set theory in mathematics is not ontological, metaphysical, or epistemic. Instead, she suggests that it unifies disparate areas of mathematics (allowing us to see that Zorn’s Lemma in algebra, the Axiom of Choice in analysis, and the Well-Ordering Principle of set theory are all in fact the same thing), helps coordinate division of labor, and eases explanation of intuitive results like the Jordan Curve Theorem. To play these roles, the axioms need only be fairly certainly consistent, and sufficient to provide surrogates for the rest of mathematics. There is no need for them to be seen as an ontological analysis of what things exist (i.e., just sets), a metaphysical analysis of mathematical entities (i.e., natural numbers are von Neumann ordinals, or Fregean extensions), or as epistemologically more certain than the intuitive premises of any given area of mathematics.

This all seems right to me, but I think this doesn’t obviate the need for some ontological, metaphysical, and epistemological foundations for mathematics. While Benacerraf has shown (in “What Numbers Could Not Be”) that set theory at least hasn’t yet answered the metaphysical questions about natural numbers, I don’t think this means that no solution is necessary. In fact, Benacerraf is often taken as the start of an argument either for structuralism about mathematical entities (which is still somewhat mysterious, as far as I can tell) or an argument that numbers are their own sort of entity apart from sets. Similarly, though the axioms themselves don’t need to be certain, it seems that they can acquire inductive justification from their explanation of various observed regularities throughout science and mathematics, and can confirm these higher level results by showing that they are part of a unified theory.

Of course, set theory as currently practiced may not end up being the right way to solve these problems. (I’m sympathetic to a Fieldian sort of fictionalism myself.) But it can still be socially useful for mathematics.