28M Hacker News comments as vector embedding search dataset

clickhouse.com

319 points by walterbell 8 hours ago


minimaxir - 7 hours ago

Don't use all-MiniLM-L6-v2 for new vector embeddings datasets.

Yes, it's the open-weights embedding model used in all the tutorials, and it was the most pragmatic model to use in sentence-transformers when vector stores were in their infancy. But it's old: it doesn't incorporate the newest advances in architectures and data training pipelines, and its context length is only 512 tokens when newer embedding models can do 2k+ with even more efficient tokenizers.

For open weights, I would instead recommend EmbeddingGemma (https://huggingface.co/google/embeddinggemma-300m), which has incredible benchmarks and a 2k context window: although it's larger and slower to encode, the payoff is worth it. As a compromise, bge-base-en-v1.5 (https://huggingface.co/BAAI/bge-base-en-v1.5) or nomic-embed-text-v1.5 (https://huggingface.co/nomic-ai/nomic-embed-text-v1.5) are also good.
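Whichever model you pick, the downstream lookup works the same way; a minimal brute-force sketch of cosine-similarity search (the tiny 4-dim vectors here are stand-ins for real 384/768-dim model outputs, not anything from the actual dataset):

```python
# Brute-force nearest-neighbor search over precomputed embeddings.
# Real deployments would use an ANN index, but the scoring is identical.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query, corpus, k=2):
    """Return the k corpus ids most similar to the query vector."""
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

corpus = {
    "c1": [1.0, 0.0, 0.0, 0.0],
    "c2": [0.9, 0.1, 0.0, 0.0],
    "c3": [0.0, 0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0, 0.0], corpus))  # ['c1', 'c2']
```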

afiodorov - 8 hours ago

I've been embedding all HN comments since 2023 from BigQuery and hosting at https://hn.fiodorov.es

Source is at https://github.com/afiodorov/hn-search

isodev - 7 hours ago

Maybe I’m reading this wrong, but commercial use of comments is prohibited by the HN privacy and data policy. So is creating derivative works (which a vector representation technically is).

delichon - 7 hours ago

I think it would be useful to add a right-click menu option to HN content, like "similar sentences", which displays a list of links to them. I wonder if it would tell me that this suggestion has been made before.

SchwKatze - 7 hours ago

I know it's unrelated, but does anyone know a good paper comparing vector search vs "normal" full-text search? Sometimes I ask myself whether the squeeze is worth the juice.
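A toy illustration of the core tradeoff being asked about: exact-term full-text matching misses paraphrases that a vector search would catch. (Real systems use BM25 scoring and ANN indexes; the documents and query here are made up for the example.)

```python
# Naive full-text retrieval: a doc matches only if it shares a literal
# query term. "automobile" never matches a query for "car", which is
# exactly the gap semantic/vector search is meant to close.
docs = {
    "d1": "the car would not start this morning",
    "d2": "my automobile refused to start today",
}

def keyword_hits(query, docs):
    """Return ids of docs sharing at least one query term."""
    terms = set(query.lower().split())
    return [doc_id for doc_id, text in docs.items() if terms & set(text.split())]

print(keyword_hits("car trouble", docs))  # ['d1'] -- d2 says "automobile"
```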

catapart - 7 hours ago

Am I misunderstanding what a parquet file is, or are all of the HN posts along with the embedding metadata a total of 55GB?
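A back-of-envelope check suggests 55 GB is plausible. Assuming 384-dim float32 vectors (all-MiniLM-L6-v2's output size); the per-comment text/metadata overhead and compression ratio are guesses, not figures from the dataset:

```python
# Rough size estimate for 28M comments with float32 embeddings.
n_comments = 28_000_000
dims = 384           # all-MiniLM-L6-v2 embedding dimension
bytes_per_float = 4  # float32

embedding_bytes = n_comments * dims * bytes_per_float
print(f"embeddings alone: {embedding_bytes / 1e9:.1f} GB")  # ~43 GB
# Comment text and metadata on top of that lands in the 55 GB ballpark
# even before Parquet's columnar compression is accounted for.
```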

zkmon - 7 hours ago

I don't know how to feel about this. Is the only purpose of the comments here to train some commercial model? I have a feeling that this might affect my involvement here going forward.

j4coh - 8 hours ago

Oh, to have had a delete account/comments option.

ProofHouse - 7 hours ago

Scratches off one of my todos.

rashkov - 4 hours ago

Is there an affordable service for doing something like this?

cdblades - 5 hours ago

Can I submit a request somewhere to have my data removed?

dangoodmanUT - 6 hours ago

Why all-MiniLM-L6-v2? This is so old and terribly behind the new models...

doctorslimm - 5 hours ago

Why is this not on Hugging Face as a dataset yet? Is anyone putting this on Hugging Face?

Kuraj - 3 hours ago

I can't help but feel a bit violated by this.

dmezzetti - 5 hours ago

Fun project. I'm sure it will get a lot of interest here.

For those into vector storage in general, one thing that has interested me lately is the idea of storing vectors as GGUF files and bringing the familiar llama.cpp-style quants to them (e.g. Q4_K, MXFP4, etc.). An example of this is below.

https://gist.github.com/davidmezzetti/ca31dff155d2450ea1b516...
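The idea above can be sketched in a few lines: block-wise 4-bit quantization of a vector, loosely in the spirit of llama.cpp's Q4 formats (a per-block scale plus small integer codes). This is a simplified absmax scheme for illustration, not the actual GGUF block layout from the gist:

```python
# Toy 4-bit block quantization: each block of 8 floats is stored as one
# float32 scale plus 8 signed 4-bit codes in [-8, 7].
def quantize_q4(vec, block=8):
    """Quantize a float vector to (scale, codes) blocks."""
    out = []
    for i in range(0, len(vec), block):
        chunk = vec[i:i + block]
        scale = max(abs(x) for x in chunk) / 7 or 1.0  # avoid div-by-zero
        codes = [max(-8, min(7, round(x / scale))) for x in chunk]
        out.append((scale, codes))
    return out

def dequantize_q4(blocks):
    """Reconstruct the approximate float vector from quantized blocks."""
    return [code * scale for scale, codes in blocks for code in codes]

vec = [0.12, -0.50, 0.33, 0.07, -0.21, 0.44, -0.09, 0.28]
restored = dequantize_q4(quantize_q4(vec))
err = max(abs(a - b) for a, b in zip(vec, restored))
print(f"max reconstruction error: {err:.3f}")
```

For similarity search, errors of this magnitude usually shift scores only slightly, which is why 4-bit storage is an appealing 8x savings over float32.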

SilverElfin - 6 hours ago

Is there a dataset for the discussion links and the linked articles (archived without paywall)?

baalimago - 7 hours ago

Finetune an LLM on post_score -> high-quality slop generator

Joshua-Peter - 7 hours ago

[dead]

slurrpurr - 5 hours ago

The most smug AI ever will be trained on this

John-Tony - 7 hours ago

[dead]

John-Tony12 - 7 hours ago

[dead]

doctorslimm - 5 hours ago

lmao this is gold

GeoAtreides - 7 hours ago

I don't remember licensing my HN comments for 3rd party processing.