Nearly all binary searches and mergesorts are broken (2006)
research.google | 161 points by thunderbong 2 days ago
(2006)
Past discussions:
https://news.ycombinator.com/item?id=33492122
https://news.ycombinator.com/item?id=14906429
https://news.ycombinator.com/item?id=12147703
https://news.ycombinator.com/item?id=9857392
https://news.ycombinator.com/item?id=6799336
> (2006)
At the very least, this should feature in the title. In fact, previous submissions from over a decade ago also featured the year in the title.
Otherwise it conveys the idea that this is some major epiphany.
Considering how often this happens, I'm surprised HN doesn't have a feature to flag and index a historically resubmitted article, whether externally by users or internally by the admin team.
Then it could have a bot create links to past submissions like the OP did and use the best arrived at title for resubmissions.
It does have the ability to catch multiple submissions of new articles but that probably has a time window of a day or so.
One problem would be that you can't just check the URL, you'd have to check the content. Not only are there many URLs that could point to the same content, but content on a page can obviously change.
I suppose you could compare against Wayback, though I'm not sure I'd try comparing with an LLM or RAG.
Also, I didn't know about this specific bug, but I spotted it almost immediately while reading the code. This is not because I'm some kind of 10x programmer but because Integer.MAX_VALUE bugs while processing arrays are actually fairly common in Java programs in 2025; I fixed one a few weeks back, and it's something I look for in code reviews.
I guess it would have been surprising in 2006?
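For anyone who hasn't read the article: the bug is in the midpoint calculation of the JDK's binarySearch. A minimal sketch of the search (variable names are mine; the fix is the one the post suggests):

    // Minimal sketch: search a sorted int[] for key.
    static int binarySearch(int[] a, int key) {
        int low = 0, high = a.length - 1;
        while (low <= high) {
            // Broken version from the article: (low + high) / 2 overflows
            // once low + high exceeds Integer.MAX_VALUE (arrays of ~2^30+
            // elements), making mid negative.
            // Fixed version, as the post suggests:
            int mid = low + ((high - low) / 2);   // or: (low + high) >>> 1
            int midVal = a[mid];
            if (midVal < key) low = mid + 1;
            else if (midVal > key) high = mid - 1;
            else return mid;
        }
        return -(low + 1);  // key not found
    }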
In 2006, if you tried to create an array with 2^30 elements you would just get an OutOfMemoryError almost anywhere.
AMD's 64-bit CPUs were released in 2003, Intel followed in 2004, and in 2006 launched its first 64-bit laptop chips (with Core 2). By then the ability to reasonably allocate more than 1GB in one shot was becoming quite widespread (the mid-2007 MBPs could be upgraded to 6GB of RAM).
And obviously Google would have been dealing with servers, where it'd be even older as they'd probably have been using PAE and each process would be able to allocate and address a full 4GB.
Not sure if you're a troll... but for general info, in 2006 most desktop computers (laptops were still not so common) had something like 100MB of RAM if you were lucky. Maybe Google already had some huge machines with 1GB+, but that was not at all a common thing outside supercomputers.
I think you’re off by a decade or so.
Common consumer laptops had 2-4 GB of RAM in 2006.
https://everymac.com/systems/apple/macbook_pro/specs/macbook...
Just like today you can buy machines with 128GB of RAM... but that doesn't mean that's what people were using... a lot of people buy machines with 4GB today (just checked the most popular website in my country, lots of the offers only have 4GB: https://www.elgiganten.se/datorer-kontor/datorer/laptop).
I remember pretty clearly that if you had anywhere above 512MB of RAM in 2006 you had very much top-of-the-line.
> I remember pretty clearly that if you had anywhere above 512MB of RAM in 2006 you had very much top-of-the-line.
That’s a different claim than your original statement of having 100MB max.