Mathematics is hard for mathematicians to understand too
science.org | 49 points by mmaaz 6 days ago
I recently came to realize the same things about physics. Even physicists find it hard to develop an intuitive mental picture of how space-time folds or what a photon is.
I think this would be extremely valuable: “We need to focus far more energy on understanding and explaining the basic mental infrastructure of mathematics—with consequently less energy on the most recent results.” I’ve long thought that more of us could devote time to serious maths problems if they were written in a language we all understood.
A little off topic perhaps, but out of curiosity - how many of us here have an interest in recreational mathematics? [https://en.wikipedia.org/wiki/Recreational_mathematics]
Yeah, I don't want to be uncharitable, but I've noticed that a lot of STEM fields make heavy use of esoteric language and syntax, and I suspect they do so as a means of gatekeeping.
I understand that some degree of formalism is required to enable the sharing of knowledge amongst people across a variety of languages, but sometimes I'll read a white paper and think "wow, this could be written a LOT more simply".
Statistics is a major culprit of this.
3blue1brown proves your point.
The saying, "What one fool can do, another can," is a motto from Silvanus P. Thompson's book Calculus Made Easy. It suggests that a task someone without great intelligence can accomplish must be relatively simple, implying that anyone can learn to do it if they put in the effort. The phrase is often used to encourage someone, demystify a complex subject, and downplay the difficulty of a task.
Gatekeeping, or self-promotion? You don't get investors/patents/promotions/tenure by making your knowledge or results sound simple and understandable.
Why not both? And that's a good point, there are a LOT of incentives to make things arbitrarily complex in a variety of fields.
> I’ve long thought that more of us could devote time to serious maths problems if they were written in a language we all understood.
That assumes it’s the language that makes it hard to understand serious math problems. That’s partially true (and the reason why mathematicians keep inventing new language), but IMO the complexity of truly understanding large parts of mathematics is intrinsic, not dependent on terminology.
Yes, you can say “A monad is just a monoid in the category of endofunctors” in terms that more people know of, but it would take many pages, and that would make it hard to understand, too.
I was writing a small article about [Set, Set Builder Notation, and Set Comprehension](https://adropincalm.com/blog/set-set-builder-natatio-set-com...) and while I was investigating, it surprised me how many different ways there are to describe the same thing. E.g.: see all the notations for a Set or a Tuple.
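To illustrate how directly the notation maps to code, a tiny Python sketch of my own (not from the article): the set comprehension below is a one-to-one transcription of the set-builder expression { x² : x ∈ ℕ, x < 10, x even }.

```python
# Set-builder: { x^2 : x in N, x < 10, x even }
evens_squared = {x**2 for x in range(10) if x % 2 == 0}
print(evens_squared)  # {0, 4, 16, 36, 64}
```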
One last rant: you don't have "the manual" for math in the same way you can go to your programming language's man page, so there is no single source of truth.
Everybody assumes...
I find it strange to compare "math" with one programming language. Mathematics is a huge and diverse field, with many subcommunities and hence also differing notation.
Your rant would be akin to this if the sides are reversed: "It's surprising how many different ways there are to describe the same thing. Eg: see all the notations for dictionaries (hash tables? associative arrays? maps?) or lists (vectors? arrays?).
You don't have "the manual" of programming languages. "
Not the original commenter, but I 100% agree that it's weird we have so many ways to describe dictionaries/hash tables/maps/etc. and lists.
> You don't have "the manual" of programming languages. "
Well, we kinda do, when you can say "this Python program". The problem with a lot of math is that you can't even tell which manual to look up.
I wrote about overlapping intervals a while ago, and used what I thought was the standard math notation for closed and half-open intervals. From comments, I learned that half-open intervals are written differently in french mathematics: https://lobste.rs/s/cireck/how_check_for_overlapping_interva...
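For what it's worth, the test itself is tiny whichever notation you write it in. A minimal sketch for half-open intervals [a, b) (or [a, b[ in the French style), with hypothetical names of my own:

```python
def overlaps(a_start, a_end, b_start, b_end):
    # True iff the half-open intervals [a_start, a_end) and [b_start, b_end)
    # share at least one point.
    return a_start < b_end and b_start < a_end

print(overlaps(0, 5, 5, 10))  # False: [0, 5) and [5, 10) only touch at 5
print(overlaps(0, 6, 5, 10))  # True: they share [5, 6)
```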
Mathematics is such an old field, older than anything except arguably philosophy, that it's too broad and deep for anyone to really understand everything. Even in graduate school I often took classes in things discovered by Gauss or Euler centuries before. A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old. So, you end up having to spend years specializing and then struggle to find others with the same background.
All of which is compounded by the desire to provide minimal "proofs from the book" and leave out the intuitions behind them.
> A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old.
Do you know the reason for that? The reason is that those problems are open and easy to understand. For the rest of open problems, you need an expert to even understand the problem statement.
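Collatz is a good example of that: the entire problem statement fits in a few lines of code. A rough Python sketch (function name mine):

```python
def collatz_steps(n):
    # Repeatedly apply n -> n/2 (n even) or n -> 3n + 1 (n odd).
    # The conjecture: this loop terminates for every positive integer n.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111
```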
Actually, a lot of minimal proofs expose more intuition than the older proofs people find at first. Counterintuitively, I usually don't find it extremely enlightening to read the first proofs of results.
I'll argue for astronomy being the oldest. Minimal knowledge would help pre-humans navigate and keep track of the seasons. Birds are known to navigate by the stars.
> Mathematics is such an old field, older than anything except arguably philosophy
If we are already venturing outside of the scientific realm with philosophy, I'm sure the fields of literature or politics are older. Especially since philosophy is just a subset of literature.
> I'm sure fields of literature or politics are older.
As far as anybody can tell, mathematics is way older than literature.
The oldest known proper accounting tokens are from 7000ish BCE, and show proper understanding of addition and multiplication.
The people who made the Ishango bone 25k years ago were probably aware of at least rudimentary addition.
The earliest writings are from the 3000s BCE, and are purely administrative. Literature, by definition, appeared later than writing.
> As far as anybody can tell, mathematics is way older than literature.
That depends what you mean by "literature". If you want it to be written down, then it's very recent because writing is very recent.
But it would be normal to consider cultural products to be literature regardless of whether they're written down. Writing is a medium of transmission. You wouldn't study the epic of Gilgamesh because it's written down. You study it to see what the Sumerians thought about the topics it covers, or to see which god some iconography that you found represents, or... anything that it might plausibly tell you. But the fact that it was written down is only the reason you can study it, not the reason you want to.
> That depends what you mean by "literature". If you want it to be written down
That is what literature means: https://en.wiktionary.org/wiki/literature#Noun
Well, then poetry is not literature.
No, the argument is even dumber than that. The person who writes a poem hasn't created any literature.
The person who hears that poem in circulation and records it in his notes has created literature; an anthology is literature but an original work isn't.
Just the other day I was listening to EconTalk on this: https://www.econtalk.org/a-mind-blowing-way-of-looking-at-ma...
I thought we were well past trying to understand mathematics. After all, John von Neumann long ago said "In mathematics we don't understand things. We just get used to them."
Many ideas in math are extremely simple at heart. Some very precise definitions, maybe a clever theorem. The hard part is often: Why is this result important? How does this result generalize things I already knew? What are some concrete examples of this idea? Why are the definitions the way they are, and not something slightly different?
To use an example from functional programming (there's a code sketch after the list), I could say:
- "A monad is basically a generalization of a parameterized container type that supports flatMap and newFromSingleValue."
- "A monad is a generalized list comprehension."
- Or, famously, "A monad is just a monoid in the category of endofunctors, what's the problem?"
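To make the first phrasing concrete, here's a minimal Python sketch using lists, where flat_map and unit are my stand-ins for flatMap and newFromSingleValue:

```python
def unit(x):
    # Wrap a single value in the container (the "newFromSingleValue" part).
    return [x]

def flat_map(f, xs):
    # Apply f (which itself returns a list) to each element, then flatten one level.
    return [y for x in xs for y in f(x)]

# Lists form a monad, and chaining flat_map gives the "generalized list
# comprehension" from the second phrasing:
pairs = flat_map(lambda x: flat_map(lambda y: unit((x, y)), [10, 20]), [1, 2])
print(pairs)  # [(1, 10), (1, 20), (2, 10), (2, 20)]
```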
The basic idea, once you get it, is trivial. But the context, the familiarity, the basic examples, and the relationships to other ideas take a while to sink in. And once they do, you ask "That's it?"
So the process of understanding monads usually isn't some sudden flash of insight, because there's barely anything there. It's more a situation where you work with the idea long enough and you see it in a few contexts, and all the connections become familiar.
(I have a long-term project to understand one of the basic things in category theory, "adjoint functors." I can read the definition just fine. But I need to find more examples that relate to things I already care about, and I need to learn why that particular abstraction is a particularly useful one. Someday, I presume I'll look at it and think, "Oh, yeah. That thing. It's why interesting things X, Y and Z are all the same thing under the hood." Everything else in category theory has been useful up until this point, so maybe this will be useful, too?)
It's probably a neurological artefact. When the brain has spent enough time looking at a pattern, it can suddenly become obvious. You can go from blind to enlightened without the usual conscious logical effort. It's very odd.
Just because someone said it doesn't mean we all agree with it, fortunately.
You know the meme with the normal distribution where the far right and the far left reach the same conclusion for different reasons, and the ones in the middle have a completely different opinion?
So on the far right you have people like von Neumann, who say "In mathematics we don't understand things". On the far left you have people like you who say "me no mats". Then in the middle you have people like me, who say "maths is interesting, let me do something I enjoy".
Of course. I just find it hilarious that someone like von Neumann would say that.
von Neumann liked saying things that he knew would have an effect like "so deep" and "he's so smart". Like when asked how he knew the answer, claiming that he did the sum in his head when undoubtedly he knew the closed-form expression.
I have a tingling suspicion that you might have missed the joke.
To date I have not met anyone who thought he summed the terms of the infinite geometric series term by term. That would take infinite time. Of course he used the expression for the sum of a geometric series.
The joke is that he missed a clever solution that does not require setting up the series, recognising it's in geometric progression and then using the closed form.
The clever solution just finds the time needed for the trains to collide, then multiplies that by the bird's speed. No series needed.
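For anyone who hasn't seen the puzzle, here's a sketch of both routes to the answer; the numbers are illustrative (versions of the anecdote use different ones):

```python
# Illustrative setup: trains 100 miles apart, each at 50 mph; bird at 75 mph.
distance, train_speed, bird_speed = 100.0, 50.0, 75.0

# Clever solution: trains collide after distance / (2 * train_speed) hours,
# and the bird flies the whole time.
t_collision = distance / (2 * train_speed)
print(bird_speed * t_collision)  # 75.0 miles

# Term-by-term: sum the legs of the bird's zigzag, which shrink geometrically.
total, gap = 0.0, distance
for _ in range(60):  # 60 legs is far more than needed for convergence
    leg_time = gap / (bird_speed + train_speed)  # bird meets the oncoming train
    total += bird_speed * leg_time
    gap -= 2 * train_speed * leg_time            # both trains kept moving
print(total)  # ~75.0 miles
```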
Ah. I was going by memory, and I had those two as separate stories. I didn't remember that he said "I did the sum" on the trains problem.