Reverse math shows why hard problems are hard

quantamagazine.org

162 points by gsf_emergency_6 a day ago


shevy-java - 12 hours ago

I recently had, for various reasons, to improve my math skills.

I was surprised at how difficult I found math. Now, I was never really great at math; logic and mental calculation I could do fairly well (above average), but foundational knowledge was hard and mathematical theory even harder. Now I even had trouble with integration and differentiation, and even with understanding a problem well enough to put it down as a formula. I am far from the youngest anymore, but I was surprised at how shockingly bad I have become over the last 25+ years. So I decided to change this in the coming months.

I think in a way computers actually made our brains worse; many problems can be auto-solved (Python with NumPy, SymPy, etc.), and computers work better than hand-held calculators, but math is surprisingly difficult without a computer. (Here I also include algorithms, by the way, or rather the theory behind algorithms. And of course I also forgot a lot of mathematical notation; somehow programming is a lot easier than higher math.)

emil-lp - 19 hours ago

Whenever the pigeonhole principle is name-dropped, we should also drop a link to Dijkstra's commentary, "The undeserved status of the pigeon-hole principle":

https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD109...

degamad - 21 hours ago

Specifically, reverse math (a subfield of metamathematics that swaps the roles of axioms and theorems, asking which axioms a given theorem requires) allows us to show that some hard problems are equivalent to each other.

EDIT: I think this line is the most telling:

> But he cautioned that the reverse mathematics approach may be most useful for revealing new connections between theorems that researchers have already proved. "It doesn’t tell us much, as far as we can say, about the complexity of statements which we do not know how to prove."

So, at this point, it helps us understand more about problems we already understand a little about, but nothing yet about new problems.

kazinator - 6 hours ago

The pigeonhole principle informs us that if we have more pigeons than holes, some holes have two or more pigeons. But also that if we have fewer pigeons than holes, some holes will necessarily be empty.

Given two bit strings of length n, if we compare fewer than n pairs of bits, we cannot tell whether the strings are equal. Equality is the proposition a_0 = b_0 ∧ a_1 = b_1 ∧ ... ∧ a_{n-1} = b_{n-1}, and you cannot simplify this formula so that any a_i or b_i variable is eliminated.
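A minimal sketch of that adversary argument in Python (the helper `equal_skipping` is hypothetical, written only to illustrate the point): any equality test that omits even one index pair can be fooled by a pair of strings differing only at the skipped position.

```python
def equal_skipping(a, b, skip):
    """A deliberately broken equality check that ignores one index."""
    return all(a[i] == b[i] for i in range(len(a)) if i != skip)

n = 8
a = [0] * n
for skip in range(n):
    b = a[:]
    b[skip] = 1  # the strings differ only at the skipped position
    assert equal_skipping(a, b, skip)  # broken check reports "equal"
    assert a != b                      # but the strings are not equal
```

Since this works for every choice of skipped index, no correct equality test can get away with fewer than n comparisons.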

dvh - 15 hours ago

Backwards game of life: https://m.youtube.com/watch?v=g8pjrVbdafY

Bankq - 21 hours ago

The approach reminds me of NP-completeness (computational hardness vs. proof-theoretic hardness). Am I over-simplifying?

boie0025 - 18 hours ago

I'm pretty far removed from learning about these things in school, but this made me wonder about the connection between the mentioned communication-complexity lower bound and special relativity's limits on how fast information can travel.

zkmon - 17 hours ago

Overall complexity (work required) is a conserved quantity. You can move it around and claim that a new algorithm has reduced the complexity, but in essence it has shifted the complexity elsewhere.

Also, whether a problem has polynomial or exponential complexity depends on what you treat as the problem size: b^c grows polynomially in b and exponentially in c.

The complexity of the traveling salesman problem likewise depends on what you consider the problem size: the number of cities or the number of connections between them.
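The b^c point can be sketched in a few lines of Python (the function names are mine, purely illustrative): hold one parameter fixed and the growth along the other is polynomial or exponential respectively.

```python
def size_fixing_c(b, c=3):
    return b ** c   # polynomial in b (degree c)

def size_fixing_b(c, b=2):
    return b ** c   # exponential in c

# Doubling b multiplies the value by a constant factor, 2**c = 8 here.
assert size_fixing_c(20) // size_fixing_c(10) == 8
# Doubling c squares the value.
assert size_fixing_b(20) == size_fixing_b(10) ** 2
```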

letmetweakit - 18 hours ago

The Travelling Salesman Problem in one dimension, on a line, is trivial. I wonder what the connection is between dimensionality and the hardness of problems like this.
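To see why the one-dimensional case is trivial, here's a minimal sketch (the open-path variant of TSP, with a hypothetical helper name): visiting the points in sorted order is optimal, so the whole problem reduces to an O(n log n) sort.

```python
def tsp_1d_path(points):
    """Shortest open path visiting all points on a line: sweep left to right."""
    order = sorted(points)
    length = order[-1] - order[0]  # total distance of the sweep
    return order, length

order, length = tsp_1d_path([5.0, -2.0, 3.0, 0.0])
# The sorted sweep [-2.0, 0.0, 3.0, 5.0] has length 7.0; any other order
# would re-cross some segment of the line and be strictly longer.
```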

empath75 - 8 hours ago

I think the reason this is interesting to mathematicians is that he was working with a fairly new axiomatic system (PV1) that, _in particular_, is thought not to be strong enough to prove the pigeonhole principle. Since he proved that all these other theorems are equivalent to the pigeonhole principle, those theorems are probably also not provable in PV1.

kittikitti - 9 hours ago

I usually don't say this about Quanta Magazine, but thank you for covering this and informing me about it. The perspective is insightful, and after comprehending the concepts a little more, I don't think it's hyperbole. I'm currently reading through the mentioned paper, "An Introduction to Feasible Mathematics and Bounded Arithmetic for Computer Scientists" by Jiatu Li, and I find it greatly elucidating.

If you're reading the paper, I recommend Section 1.3, where it goes over the examples of binary search and Dijkstra's algorithm. The idea from the preface that "natural numbers are encoded by binary strings just as other data structures" is prevalent in the construction of their proofs. As a computer scientist, I find this great because I intuitively recognize that both algorithms and memory consist of only 1s and 0s underneath all the abstractions and code.
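As a refresher on one of those examples, here is the textbook binary search over a sorted list (a generic sketch, not the paper's exact formalization in bounded arithmetic):

```python
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # halve the search interval each step
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```

The loop touches O(log n) positions, which is what makes it a natural worked example for feasibility arguments.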

This work ties together practical applications and complexity theory to create a new bridge in mathematics that I'm excited about exploring. I'm especially interested in reverse mathematics applied to space complexity.

Here are some additional resources I found on this: a talk by Jiatu Li (joint work with Lijie Chen and Igor Carboni Oliveira), "Reverse Mathematics of Complexity Lower Bounds": https://www.youtube.com/watch?v=g5EqAgDxxE0
