Faster Than Dijkstra?

systemsapproach.org

83 points by drbruced 4 days ago


shoo - 42 minutes ago

The underlying argument this article seems to be making is that the appropriate algorithm for a given application isn't always the one with the most efficient asymptotic performance for sufficiently large n. For a given application (in this case routing), we have data on the typical values of n that appear in reality, and we can choose an algorithm that offers good-enough (or optimal) performance for n in that constrained range, as well as having other potential benefits, such as being simpler to implement correctly in a short amount of engineering time.

This argument is very much in line with Mike Acton's data-driven design philosophy [1]: understand the actual specific problem you need to solve, not the abstract general problem. Understand the statistical distribution of actual problem instances; their parameters will fall in some range. Understand the hardware you're writing software for, and its finite capacity and capability.

It's common for new algorithms or data structures with superior asymptotic efficiency to be less performant than simpler alternatives at smaller problem sizes. As always, it depends on the specifics of the application.

[1] see Mike's CppCon 2014 talk "Data-Oriented Design and C++" https://www.youtube.com/watch?v=rX0ItVEVjHc
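One concrete illustration of the simpler-beats-asymptotically-better point is the standard hybrid-sort trick: fall back to insertion sort below a small cutoff, because its low constant factors and cache-friendly access beat merge sort on tiny inputs. A minimal sketch in Python (the cutoff value is illustrative, not tuned):

```python
# Illustrative hybrid sort: O(n^2) insertion sort below CUTOFF,
# O(n log n) merge sort above it. CUTOFF is made up for the sketch,
# not a benchmarked value.

CUTOFF = 16

def insertion_sort(a):
    # Quadratic worst case, but very fast on tiny inputs.
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def hybrid_sort(a):
    # Merge sort overall, switching strategy at the cutoff.
    if len(a) <= CUTOFF:
        return insertion_sort(a)
    mid = len(a) // 2
    left, right = hybrid_sort(a[:mid]), hybrid_sort(a[mid:])
    # Standard two-way merge of the sorted halves.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

Real implementations (e.g. CPython's Timsort) use exactly this kind of small-run insertion sort internally.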

vprcic - 4 hours ago

Each time a discussion about sorting starts, I'm reminded of a "lively debate" I had with my boss/friend about the optimal sorting approach. He claimed it's O(n), pointing to counting sort as an example. This didn't sit well with me. A sorting algorithm, I insisted, should be defined something like "a function taking an unsorted array of elements and returning a sorted one". But it seems there is no agreed-upon definition, and you can have something like counting sort, where you get an array in O(n) but still need O(m) to get the index of a specific element. I argued that, by that logic, the best sorting algorithm is the do-nothing algorithm: it returns the initial array in O(1), but needs O(m log m) to get you the index of an element (it uses merge sort to do it).
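For reference, a minimal counting sort sketch in Python. It does return a fully sorted array in O(n + k) time, but only under the assumption that keys are non-negative integers below a known bound k (names here are mine, not from the debate):

```python
# Counting sort: O(n + k) time, O(k) extra space, where k bounds the
# key range. Only applicable to small integer keys, which is why it
# sidesteps the comparison-sort lower bound.

def counting_sort(a, k):
    counts = [0] * k
    for x in a:                          # O(n): tally each key
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):   # O(k): emit keys in order
        out.extend([value] * c)
    return out
```

Note the O(k) term: if the key range is huge relative to n, the "linear" sort stops being linear in any useful sense.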

alias_neo - 6 hours ago

Deja Vu.

I'm sure I read this article a few days ago, word for word, but it wasn't on the site in the OP. It stood out because when it mentioned textbooks and said "including ours", I looked at the site and thought to myself, "they do textbooks?".

fn-mote - 2 hours ago

The post should have a more conspicuous link to the Quanta article that it references, if not the original paper. (It's there, just not at the top.)

https://www.quantamagazine.org/new-method-is-the-fastest-way...

https://arxiv.org/abs/2504.17033

shiandow - 5 hours ago

Interestingly, sorting is O(N) for a surprisingly large class of datatypes: anything that behaves well under lexicographic ordering, really.

Supposing one uses a 'trie' as a priority queue, the inserts and pops are effectively constant.
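A rough sketch of the trie-as-priority-queue idea, for fixed-width non-negative integer keys: insert and pop-min each walk at most BITS levels, so for a fixed word size each operation is effectively constant, and draining the queue yields the keys in sorted order (this is essentially radix sorting). This is my own illustration, not code from the comment:

```python
# A bitwise trie used as a priority queue for integers in [0, 2^BITS).
# Each operation touches at most BITS nodes, so for a fixed word size
# the per-operation cost is bounded by a constant.

BITS = 16  # fixed key width; an assumption of this sketch

class TrieHeap:
    def __init__(self):
        self.root = {}  # nested dicts keyed by bit 0/1; leaves hold a count

    def insert(self, x):
        node = self.root
        for i in range(BITS - 1, -1, -1):
            node = node.setdefault((x >> i) & 1, {})
        node["count"] = node.get("count", 0) + 1  # leaf: multiplicity

    def pop_min(self):
        # Walk down preferring 0-branches; that path spells the minimum.
        path, node, x = [], self.root, 0
        for _ in range(BITS):
            b = 0 if 0 in node else 1
            path.append((node, b))
            node = node[b]
            x = (x << 1) | b
        node["count"] -= 1
        if node["count"] == 0:
            del node["count"]
            # Prune emptied nodes bottom-up so future walks stay valid.
            for parent, b in reversed(path):
                if not parent[b]:
                    del parent[b]
                else:
                    break
        return x
```

Inserting n keys and popping them all costs O(n · BITS), i.e. linear in n for a fixed word size, which is the lexicographic loophole the comment is pointing at.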

NooneAtAll3 - 4 hours ago

am I missing something?

this was a lot of words that sum up to "I heard that new algorithm exists but spent zero effort actually evaluating it"

jason_s - 6 hours ago

Intriguing article. Sometimes practical issues override theoretical ones, and it would be interesting to see which one dominates in networking.

(Side note: does anyone else get thrown off by the Epilogue font? It looks very wide in some cases and very narrow in others... it makes me want to override it with Stylus, except my employer has blocked browser extensions for security reasons, which raises the question of why I'm even reading the article on this computer....)

K0IN - 4 hours ago

To all the Germans here: I can only recommend this video from "dorfuchs":

https://youtu.be/3ge-AywiFxs?si=TbcRsBNkzGhpOxQ4&t=842

(Timestamped.) He shows a derivation that the best a sorting algorithm can do is O(n log(n)) for n positive real numbers.
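The derivation is presumably the standard decision-tree lower bound for comparison sorts; my reconstruction, not taken from the video:

```latex
% A comparison sort must distinguish all n! input orderings, so its
% decision tree has at least n! leaves, hence depth at least \log_2(n!):
\[
  \log_2(n!) \;=\; \sum_{i=1}^{n} \log_2 i
  \;\ge\; \sum_{i=\lceil n/2 \rceil}^{n} \log_2 i
  \;\ge\; \frac{n}{2}\,\log_2\!\frac{n}{2}
  \;=\; \Omega(n \log n)
\]
% So any comparison-based sort needs \Omega(n \log n) comparisons in the
% worst case. Counting/radix sorts evade the bound only because they
% inspect keys directly rather than comparing elements.
```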

qsort - 6 hours ago

I struggle to see the point. The paper in question doesn't claim to be practically faster or to want to "replace" Dijkstra; they are just saying "we got a better big-O", and I don't see any reason to doubt them on that.

It's actually common for algorithms with a lower asymptotic complexity to be worse in practice; a classic example is matrix multiplication.

Also please, please, can we stop with the "eww, math" reactions?

> The new approach claims order (m log^(2/3) n) which is clearly going to be less for large enough n. (I had to take a refresher course on log notation before I could even write that sentence with any confidence.)

I'm sure the author is just exaggerating, he's clearly very competent, but it's a sentence with the vibes of "I can't do 7x8 without a calculator."
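For a sense of scale, the per-edge log factors in the quoted bound can be compared numerically. A quick sketch, assuming the baseline being beaten is binary-heap Dijkstra's O(m log n) (the specific n values are just illustrative):

```python
# log^(2/3) n grows strictly slower than log n, so the gap between the
# two per-edge factors widens as n grows; the ratio is log^(1/3) n.
import math

for n in (10, 10**3, 10**6, 10**9):
    lg = math.log2(n)
    print(f"n={n:>10}  log^(2/3) n = {lg ** (2 / 3):6.2f}   log n = {lg:6.2f}")
```

Even at n = 10^9 the ratio is only about 3x, which is one reason constant factors can still decide the practical winner.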