Last fifty years of integer linear programming: Recent practical advances (2024)

inria.hal.science

204 points by teleforce a day ago


aquafox - 20 hours ago

Could someone maybe give a high-level explanation of why commercial ILP solvers (e.g. Gurobi) are so much better than free/open-source ones? Is it because ILP is inherently that difficult to solve (I know it's NP-hard), and the best solvers are just a large ensemble of heuristics for very specific sub-problems, so no general "good" strategy has made its way into the public domain?
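For context on what those heuristics sit on top of: every MILP solver, commercial or open-source, is built around the same LP-relaxation-plus-branching core. Below is a hypothetical toy illustration (not any real solver's code) of branch-and-bound on a 0/1 knapsack, where the LP relaxation gives the pruning bound; the commercial edge comes from the cuts, presolve, and heuristics layered onto this skeleton.

```python
# Toy branch-and-bound for 0/1 knapsack: the bare skeleton that real
# MILP solvers (Gurobi, CPLEX, CBC, ...) wrap in hundreds of heuristics.

def fractional_bound(items, cap):
    """Upper bound from the LP relaxation: fill greedily by value/weight,
    allowing one fractional item (the LP optimum for knapsack)."""
    total = 0.0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if w <= cap:
            total, cap = total + v, cap - w
        else:
            return total + v * cap / w  # fractional item -> LP optimum
    return total

def knapsack_bb(items, cap):
    best = 0
    def branch(idx, value, cap):
        nonlocal best
        if idx == len(items):
            best = max(best, value)
            return
        # prune: LP bound on the remaining items can't beat the incumbent
        if value + fractional_bound(items[idx:], cap) <= best:
            return
        v, w = items[idx]
        if w <= cap:
            branch(idx + 1, value + v, cap - w)  # branch: take item
        branch(idx + 1, value, cap)              # branch: skip item
    branch(0, 0, cap)
    return best

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight)
print(knapsack_bb(items, 50))  # -> 220
```

The gap the question asks about lives almost entirely in how aggressively the pruning bound is tightened (cutting planes, presolve reductions, primal heuristics), not in this basic search.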

dcminter - 21 hours ago

I vaguely recall building a resource allocation tool using IBM's "ILOG" mixed integer linear programming library, and we realised that if we'd built the tool about 20 years earlier, it would still have been running on the same problems we were now solving within 5 minutes.

As I recall it the raw computer power had increased by a factor of around a thousand and the algorithms had improved by about the same, giving us a factor of a million improvement.

Worth pondering when trying to predict the future!

The "resources" in question were diamonds by the way...

tormeh - 18 hours ago

Can anyone share how this is used in practice? Somehow I imagine implementing numerical optimization often fails due to the usual problems with data-driven approaches (trust, bad data, etc.) and ultimately someone important just decides how things are going to be done based on stomach feel.

pradn - 11 hours ago

"... between 1988 and 2004, hardware got 1600 times faster, and LP solvers got 3300 times faster, allowing for a cumulative speed-up factor higher than 5 × 10^6, and that was already 20 years ago!"

"The authors observed a speedup of 1000 between [the commercial MILP solvers of] 2001 and 2020 (50 due to algorithms, 20 due to faster computers)."

I wonder if we can collect these speedup factors across computing subfields, decomposed by the contribution of algorithmic improvements, and faster computers.
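The decomposition in both quotes is multiplicative (hardware factor × algorithm factor), which is easy to sanity-check:

```python
# Checking the multiplicative speedup claims quoted above.
hw_1988_2004, algo_1988_2004 = 1600, 3300
print(hw_1988_2004 * algo_1988_2004)  # -> 5280000, i.e. > 5e6

hw_2001_2020, algo_2001_2020 = 20, 50
print(hw_2001_2020 * algo_2001_2020)  # -> 1000
```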

In compilers, there's "Proebsting's Law": compiler advances double computing power every 18 years.

djoldman - a day ago

I've heard Gurobi is fairly expensive. Anyone willing to share pricing details?

FabHK - 21 hours ago

I remember implementing some version of Gomory cutting planes back in the 1990s in Maple (for learning, not for production). Looks like the field has progressed...
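For readers who never implemented them: a Gomory fractional cut comes straight out of a simplex tableau row. If a row reads x_B + Σ a_j x_j = b with fractional b, then Σ frac(a_j) x_j ≥ frac(b) is valid for every integer solution but cuts off the current fractional vertex. A minimal sketch (the example row is made up for illustration):

```python
from fractions import Fraction as F

def gomory_cut(row, rhs):
    """From a simplex row  x_B + sum(a_j * x_j) = rhs  with fractional rhs,
    derive the Gomory fractional cut:  sum(frac(a_j) * x_j) >= frac(rhs)."""
    frac = lambda a: a - (a.numerator // a.denominator)  # fractional part
    return [frac(a) for a in row], frac(rhs)

# Hypothetical tableau row:  x1 + (7/2) x3 + (3/2) x4 = 9/2
coeffs, cut_rhs = gomory_cut([F(7, 2), F(3, 2)], F(9, 2))
print(coeffs, cut_rhs)  # -> [Fraction(1, 2), Fraction(1, 2)] Fraction(1, 2)
# i.e. the cut  (1/2) x3 + (1/2) x4 >= 1/2  is added to the LP.
```

Exact rationals (rather than floats) sidestep the numerical issues that make cut generation delicate in real solvers.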

> if we needed two months of running time to solve an LP in the early 1990s, we would need less than one second today. Recently, Bixby compared the machine-independent performance of two MILP solvers, CPLEX and Gurobi, between 1990 and 2020 and reported speed-ups of almost 4×10^6.

beret4breakfast - 20 hours ago

It feels like there’s a significant lack of “ML/AI” based approaches applied to these kinds of problems. I’ve seen a lot of examples of RL/GNN papers that do attempt to solve smaller problems, but it always feels like the best option is to just pay for a Gurobi license and have at it. I’ve been doing some scheduling optimisation recently (close to job shop scheduling), and while there are some examples of using RL, they just don’t seem to cut it. I’ve resorted to evolutionary algorithms to get reasonable solutions to some big problems. Maybe it’s just always more efficient to use OR type approaches when you can formulate the problem well.
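The evolutionary-algorithm route mentioned here typically means a permutation encoding plus mutation and elitist selection. A hypothetical minimal sketch (not the commenter's actual setup) on a toy single-machine problem, minimising total completion time, where the known optimum is shortest-processing-time order:

```python
import random

def total_completion_time(order, durations):
    """Sum of completion times when jobs run in the given order."""
    t = total = 0
    for j in order:
        t += durations[j]
        total += t
    return total

def evolve(durations, pop_size=30, gens=200, seed=0):
    """Elitist EA over job permutations with swap mutation."""
    rng = random.Random(seed)
    n = len(durations)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: total_completion_time(p, durations))
        survivors = pop[:pop_size // 2]       # elitism: keep best half
        children = []
        for p in survivors:
            child = p[:]
            i, j = rng.sample(range(n), 2)    # swap mutation
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: total_completion_time(p, durations))

durations = [4, 7, 2, 9, 1, 5]
best = evolve(durations)
# For this objective, shortest-processing-time order is provably optimal,
# which gives a ground truth to check the EA against.
print(best, total_completion_time(best, durations))
```

Job-shop variants keep the same loop and only change the decoder from permutation to schedule, which is exactly why EAs scale to problems the RL papers struggle with.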

perching_aix - a day ago

title could use [pdf] [2024]

CyberDildonics - a day ago

Integer linear programming doesn't sound very complex.