My friend Luke Biewald pointed me to an interesting post suggesting that the language of mathematics is actually best viewed as a natural language rather than a formal language. I think some of the points the author makes about math don’t actually distinguish natural from formal languages (recursion, self-reference, an alphabet with rules for combination, and so on). And since he works in natural language processing, I suspect he thinks of naturalness as a sort of simplicity, rather than as a kind of complexity (which is how I would think of it). But it’s an interesting point.
Mathematicians basically never write fully formal proofs in the sense that logicians like to talk about. They regularly “abuse notation” and overload symbols in order to simplify their way of speaking. Many of these conventions are in fact quite historically contingent – if we hadn’t originally started abbreviating things one way, further developments along that path would have looked incomprehensible to a community that had abbreviated things differently.
Given that mathematical notation is actually used by a relatively large community for certain essential (to those people) types of communication, it has most likely picked up a lot of the irregularities and “irrationalities” that plague natural languages – probably much more so than constructed languages like Esperanto and Klingon. I don’t know what all the relevant differences are between “real” natural languages, Klingon, and math, but comparing them may reveal something interesting about at least one of these languages.