The Principles of Diffusion Models
arxiv.org | 200 points by Anon84 21 hours ago
If you're more into videos, be sure to check out Stefano Ermon's CS236 Deep Generative Models [1]. All lectures are available on YouTube [2].
[1] https://deepgenerativemodels.github.io/
[2] https://m.youtube.com/playlist?list=PLoROMvodv4rPOWA-omMM6ST...
I wish Stanford still offered CS236, but they haven't run it in two years :(
Is there something equivalent in scope and comprehensiveness for transformers?
hn question: how is this not a dupe of my days-old submission (https://news.ycombinator.com/item?id=45743810)?
It is, but dupes are allowed in some cases:
“Are reposts ok?
If a story has not had significant attention in the last year or so, a small number of reposts is ok. Otherwise we bury reposts as duplicates.”
https://news.ycombinator.com/newsfaq.html
Also, from the guidelines: “Please don't post on HN to ask or tell us something. Send it to hn@ycombinator.com.”
in other words - "it is lol, also go pound sand"
What's the problem? Someone submitted it for people to read but it didn't catch on, now it's resubmitted and people can read it after all. Everyone happy. Don't be so attached to imaginary internet points.
CTRL-F: "Fokker-Planck"
> 97 matches
Ok I'll read it :)
why am I only getting 26 matches? where's the threshold then? :D
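For anyone wondering why that term shows up so often: the Fokker–Planck equation describes how the marginal density of a diffusion SDE evolves over time, which is what links the forward noising process to the score-based reverse process. A standard statement (generic notation, not necessarily the monograph's):

```latex
% Forward SDE: dx = f(x,t)\,dt + g(t)\,dw
% Its marginal density p_t(x) satisfies the Fokker--Planck equation:
\frac{\partial p_t(x)}{\partial t}
  = -\nabla_x \cdot \big( f(x,t)\, p_t(x) \big)
  + \frac{1}{2}\, g(t)^2 \, \nabla_x^2\, p_t(x)
```

The drift term transports probability mass along f, while the diffusion term spreads it out at rate g(t)^2; diffusion models exploit this to relate the noising SDE to a reverse-time SDE driven by the score.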
I'm scared by the maths
Reading this reinforces that a lot of what makes up current "AI" is brute forcing and not actually intelligent or thoughtful. Although I suppose our meat-minds could also be brute-forcing everything throughout our entire lives, and consciousness is like a chat prompt sitting on top of the machinery of the mind. But artificial intelligence will always be just as soulless and unfulfilling as artificial flavors.
Guessing you’re a physicist based on the name. You don’t think automatically doing RG flow in reverse has beauty to it?
There’s a lot of “force” in statistics, but that force relies on pretty deep structures and choices.
Intelligence is the manifold that these brute-force algorithms learn.
Of course we don’t brute-force this in our lifetime. Evolution encoded the coarse structure of the manifold over billions of years. And then encoded a hyper-compressed meta-learning algorithm into primates across millions of years.
470 pages?!?!?!? FML! :-D