15 years later, Microsoft morged my diagram

nvie.com

909 points by cheeaun 13 hours ago


m12k - 10 hours ago

Regarding the original git-flow model: I've never had anyone able to explain to me why it's worth the hassle to do all the integration work on the "develop" branch, while relegating the master/main branch to just being a place to park the tag from the latest release. Why not just use the master/main branch for integration instead of the develop branch - like the git gods intended - and then not have the develop branch at all? If your goal is to have an easy answer to "what's the latest release?", you have the tags for that in any case. Or if you really want to have a whole branch just to double-solve that one use-case, why not make a "release-tags" branch for that, instead of demoting the master/main branch to that role, when it already has a widely used, different meaning?

It's a pity that such a weird artifact/choice has made its way into a branching model that has become so widely implemented. Especially when the rest of it is so sensible - the whole "feature-branch, release-branch, hotfix" flow is IMO exactly right for versioned software where you must support multiple released versions of it in the wild (and probably the reason why it's become so popular). I just wish it didn't have that one weirdness marring it.
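
To make the "tags already answer that" point concrete, here is a minimal sketch in Python (the helper name and the v1.4.2 version are made up for illustration): cutting a release is just tagging main, and asking git for the most recent tag reachable from main answers "what's the latest release?".

    import subprocess

    def latest_release(branch: str = "main") -> str:
        # Most recent tag reachable from the given branch: the question
        # git-flow reserves a whole branch for, answered by tags alone.
        return subprocess.run(
            ["git", "describe", "--tags", "--abbrev=0", branch],
            capture_output=True, text=True, check=True,
        ).stdout.strip()

    # Cutting a release is just tagging the integration branch (main) directly.
    subprocess.run(["git", "tag", "-a", "v1.4.2", "-m", "Release 1.4.2"], check=True)
    print(latest_release())  # -> v1.4.2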

alex_suzuki - 11 minutes ago

For context:

> At Microsoft, we're working to add articles to Microsoft Learn that contain AI-generated content. Over time, more articles will feature AI-generated text and code samples.

From: https://learn.microsoft.com/en-us/principles-for-ai-generate...

<vomit emoji here>

Animats - 12 hours ago

This is so out of hand.

There's this. There's that video from Los Alamos discussed yesterday on HN, the one with a fake shot of some AI generated machinery. The image was purchased from Alamy Stock Photo. I recently saw a fake documentary about the famous GG-1 locomotive; the video had AI-generated images that looked wrong, despite GG-1 pictures being widely available. YouTube is creating fake images as thumbnails for videos now, and for industrial subjects they're not even close to the right thing. There's a glut of how-to videos with AI-generated voice giving totally wrong advice.

Then newer LLM training sets will pick up this stuff.

"The memes will continue" - White House press secretary after posting an altered shot of someone crying.

rmunn - 12 hours ago

Similar story. I'm American but work and live outside the US, so I don't know how likely this would be if I had ordered from Amazon. But I ordered a rug for my sons' room from this country's equivalent to Amazon (that is, the most popular order-online-and-we-ship-to-you storefront in this country), and instead of what I ordered (a rug with an image showing the planets, with labels in English) I got an obviously AI-generated copy of the image, whose letters were often mangled (MARS looked like MɅPS, for example). Thankfully the storefront allowed me to return it for a refund, I ordered from a different seller on the second try, and this time I received a rug that precisely matched the image on the storefront. But yes, there are unscrupulous merchants who are using AI to sloppily copy other people's work.

nippoo - 13 hours ago

They've taken it down now and replaced it with an arguably even less helpful diagram, but the original is archived: https://archive.is/twft6

jezzamon - 13 hours ago

"continvoucly morged" is such a perfect phrase to describe what happened, it's poetic

hansmayer - 11 hours ago

This is hilarious, actually. I am starting to lean into the "AI-dangerous" camp, but not because the chatbot will ever become sentient. It's precisely because of the increasingly widespread adoption of unreliable tools by the incompetent but self-confident Office Worker (R).

anonymous908213 - 12 hours ago

Microsoft employee (VP of something or other, for whatever Microsoft uses "VP" to mean) doing damage control on Bluesky: https://bsky.app/profile/scott.hanselman.com/post/3mez4yxty2...

> looks like a vendor, and we have a group now doing a post-mortem trying to figure out how it happened. It'll be removed ASAFP

> Understood. Not trying to sweep under rugs, but I also want to point out that everything is moving very fast right now and there’s 300,000 people that work here, so there’s probably be a bunch of dumb stuff happening. There’s also probably a bunch of dumb stuff happening at other companies

> Sometimes it’s a big systemic problem and sometimes it’s just one person who screwed up

This excuse is hollow to me. In an organization of this size, it takes multiple people screwing up for a failure to reach the public, or at least it should. In either case -- no review process, or a failed review process -- the failure is definitionally systemic. If a single person can on their own whim publish not only plagiarised material, but material that is so obviously defective at a single glance that it should never see the light of day, that is in itself a failure of the system.

tombert - 10 hours ago

Is there a single thing that Microsoft doesn’t half-ass? Even if you wanted to AI-generate a graph, how hard is it to go into Paint or something and fix the text?

I have been having oodles of headaches dealing with exFAT not being journaled and having to engineer around it. It’s annoying because exFAT is basically the only filesystem used on SD cards, since it’s the only one that’s compatible with everything.

It feels like everything Microsoft does is like that though; superficially fine until you get into the details of it and it’s actually broken, but you have to put up with it because it’s used everywhere.

cwal37 - 13 hours ago

LinkedIn is also a great example of this stuff at the moment. Every day I see posts where someone clearly took a slide or a diagram from somewhere, then had ChatGPT "make it better" and write text for them to post along with it. Words get mangled, charts no longer make sense, but these people clearly aren't reading anything they're posting.

It's not like LinkedIn was great before, but the business-influencer incentives there seem to have really juiced nonsense content that all feels gratingly similar. Probably doesn't help that I work in energy which in this moment has attracted a tremendous number of hangers-on looking for a hit from the data center money funnel.

etyhhgfff - 36 minutes ago

> This isn't a case ... It's the opposite of that. It's taking something that ...

"Its not this its that" is the new em-dash.

andai - 5 hours ago

https://en.wiktionary.org/wiki/morg

Morg doesn't seem to be a word in English (though it is in Irish!), but it sounds like it should be.

This is one aspect of AI I will miss, if we ever figure out how to make it go away. The delightful chaos. It invented a word here, without even meaning to.

For example, I vibe coded a QWOP clone the other day, and instead of working human legs, it gave me helicopter legs. You can't walk, but if you mash the keyboard, your legs function as a helicopter and you can fly through the sky.

That obviously wasn't intentional! But it was wonderful. I fear that in a few years, AI will be good enough to give me legs that don't fly like a helicopter. I think we will have lost something special at that point.

When I program manually, I am very good at programming bugs. If I'm trying to make something reliable, that's terrible. But if I'm trying to make a computer do something nobody even realized it can do... making it do things you weren't expecting is the only reliable way to do that.

So I've been working on a way to reintroduce bugs mechanically, by mutating the AST. The fundamental idea is sound -- most of my bugs come from "stuff I obviously meant to type, but didn't" -- but it needs a bit more work. Right now it just produces nonsense even I wouldn't come up with :)

I currently have "mess up the file". The next two phases would be "in a way that it still compiles" and "in a way that it doesn't (immediately) crash at runtime" (since the whole point is "it still runs, but it does something weird!"). More research needed :)
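
As a minimal sketch of that idea (not the actual tool; the class name and the sample function are made up for illustration), here is one way to mutate a Python AST so the file gets messed up but still parses, which covers the "still compiles" phase:

    import ast, random

    class BugInjector(ast.NodeTransformer):
        # Randomly flips comparison operators: a crude generator of
        # "stuff I obviously meant to type, but didn't" that keeps the
        # file syntactically valid.
        def visit_Compare(self, node):
            self.generic_visit(node)
            if random.random() < 0.5:
                swaps = {ast.Lt: ast.Gt, ast.Gt: ast.Lt, ast.Eq: ast.NotEq}
                node.ops = [swaps.get(type(op), type(op))() for op in node.ops]
            return node

    source = """
    def walk(steps):
        position = 0
        while position < steps:
            position += 1
        return position
    """

    tree = ast.fix_missing_locations(BugInjector().visit(ast.parse(source)))
    # Still valid Python, but "<" may now be ">": the walk loop never runs.
    print(ast.unparse(tree))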

munificent - 5 hours ago

> What's dispiriting is the (lack of) process and care: take someone's carefully crafted work, run it through a machine to wash off the fingerprints, and ship it as your own.

We should start calling this "copyright laundering".

adzm - 12 hours ago

> Till next 'tim'

It took me a few tries to see that the morged version actually says tiന്ന

AshleysBrain - 10 hours ago

Is this not a good example of how generative AI does copyright laundering? Suppose the image was AI-generated and it did a bad copy of the source image that was in the training data, which seems likely with such a widely disseminated image. When using generative AI to produce anything else, how do you know it's not just doing a bad-quality copy-paste of someone else's work? Are you going to scour the internet for the source? Will the AI tell you? What if code generation is copy-pasting GPL-licensed code into your proprietary codebase? The likelihood of this, the lack of any easy way to know it's happening, and the risks it creates all seem to be overlooked amidst the AI hype. And generative AI is a lot less impressive if it often works as a bad-quality copy-paste tool rather than the galaxy-brain intelligence some like to portray it as.

xxr - 11 hours ago

When I read the title, I thought "morg" was one of those goofy tech words that I had missed but whose meaning was still pretty clear in context (like a portmanteau of "Microsoft" and "borged," the latter of which I've never heard as a verb but still works). I guess it's a goofy tech word now.

Brian_K_White - 13 hours ago

Please let morged become a thing.

rambambram - 9 hours ago

Microsoft is only doing something about this now because there's enough evidence for a lawsuit. I don't know about the US, but the author seems to be from The Netherlands. Correct me if I'm wrong (I don't know the exact legal name for it), but there's something like a right not to have your intellectual property subjected to 'distortion or mutilation'.

Microsoft just spits in this creator's face by mutilating his creation this badly.

nicbou - 9 hours ago

> Is there even a goal here beyond "generating content"?

This is the part that hurts. It's all so pointless, so perfunctory. A web of incentives run amok. Systems too slick to stop moving. Is this what living inside the paperclip maximizer feels like?

Words we didn't write, thoughts we didn't have, for engagement, for a media presence, for an audience you can peddle yourself to when your bullshit job gets automated. All of that technology, all those resources, and we use it to drown humanity in noise.

aftergibson - 11 hours ago

Archive.org shows this went live last September: https://web.archive.org/web/20250108142456/https://learn.mic...

It took ~5 months for anyone to notice and fix something that is obviously wrong at a glance.

How many people saw that page, skimmed it, and thought “good enough”? That feels like a pretty honest reflection of the state of knowledge work right now. Everyone is running at a velocity where quality, craft and care are optional luxuries. Authors don’t have time to write properly, reviewers don’t have time to review properly, and readers don’t have time to read properly.

So we end up shipping documentation that nobody really reads and nobody really owns. The process says “published”, so it’s done.

AI didn’t create this, it just dramatically lowers the cost of producing text and images that look plausible enough to pass a quick skim. If anything it makes the underlying problem worse: more content, less attention, less understanding.

It was already possible to cargo-cult GitFlow by copying the diagram without reading the context. Now we’re cargo-culting diagrams that were generated without understanding in the first place.

If the reality is that we’re too busy to write, review, or read properly, what is the actual function of this documentation beyond being checkbox output?

duxup - 3 hours ago

There's a lot of "bad look" things that aren't a big deal.

But man this one indicates such a horrible look / lack of effort (like none) from Microsoft.

Not that Microsoft is short on bad looks, but this really seems like one of those painfully symbolic ones.

crossroadsguy - 11 hours ago

Something tangential..

> people started tagging me on Bluesky and Hacker News

Never knew tagging was a thing on Hacker News. Is it a special feature for crème de la crème users?

jacquesm - 9 hours ago

I wonder how far the balance will have to tip before the general public realizes the danger. Humanity's combined culture, for better or worse, up to 2021 or so was captured in a very large but still ultimately finite stream of bits. And now we're diluting those bits at an ever greater speed. The tipping point where there are more generated than handcrafted bits is rapidly approaching and obviously it won't stop there. A few more years and the genuine article is going to be a rarity.

QuiCasseRien - 6 hours ago

This is the type of diagram that horrifies me.

Maintaining hotfixes for previous releases is a very, very hard and time-consuming task for a dev!

Yeah, it's easier for users; they don't have to care about breaking changes or a migration guide. They just blindly update to the nearest minor.

But as time goes on, the dev's code ends up being a complete mess of git branches and backports. The dev eventually forgets some patches and the software ends up with a major security hole.

The dev ends up exhausted, frustrated by their project and roasted by their users.

=> What I do: don't maintain any previous release, but provide a strong migration guide and list all breaking changes!

Users just have to follow updates or use other software.

I'm happy with it; my project has no code debt and cleaner code.

chromehearts - 12 hours ago

Billions must morge

bob1029 - 10 hours ago

This is why we don't use diffusion-style models for diagrams or anything containing detailed typography.

An LLM driving mermaid with text tokens will produce infinitely more accurate diagrams than something operating in raster space.

A lot of the hate being generated seems due to really poor application of the technology. Not evil intent or incapable technology. Bad engineering. Not understanding when to use png vs jpeg. That kind of thing.
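
As a minimal sketch of that text-first approach (assuming Mermaid's gitGraph syntax; the branch names and tag are illustrative), the whole "diagram" is just tokens, so no label can come out morged:

    # The diagram source is plain text; a Mermaid renderer (e.g. mermaid.live)
    # turns it into the picture, and every label stays exactly as written.
    MERMAID_GITGRAPH = """gitGraph
      commit id: "init"
      branch develop
      checkout develop
      commit id: "feature work"
      checkout main
      merge develop
      commit tag: "v1.0.0"
    """

    print(MERMAID_GITGRAPH)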

ezst - 12 hours ago

Waiting for the LLM evangelists to tell us that their box of weights of choice did that on purpose to create engagement as a sentient entity understanding the nature of tech marketing, or that OP should try again with quatuor 4.9-extended (that really ships AGI with the $5k monthly subscription addon) because it refactored their pet project last week into a compilable state, after only boiling 3 oceans.

zahlman - 12 hours ago

I'm glad I actually checked TFA before asking here if "morging" referred to some actual technical concept I hadn't previously heard of.

imglorp - 3 hours ago

I find it interesting that current AI, as stellar as it is at language, and even at reading text in images, falls over hard when generating text in images.

bulbar - 11 hours ago

Good example of the fact that LLMs, at their core, are lossy compression algorithms that are able to fill in the gaps very cleverly.

1970-01-01 - 4 hours ago

Never forget V is for carrot. https://i.redd.it/gj6tf34vkzcg1.png

asddubs - 10 hours ago

Here's the page with the diagram on it:

https://web.archive.org/web/20250908220945/https://learn.mic...

eurekin - 10 hours ago

It's funny how big an impact individual developers can have with such seemingly simple publications. When the article with that diagram was released, I was changing jobs, and I distinctly remember that the diagram was extensively discussed and compared to company standards at both the old and the new place.

KronisLV - 10 hours ago

> take someone's carefully crafted work, run it through a machine to wash off the fingerprints, and ship it as your own.

I don’t even care about AI or not here. That’s like copying someone’s work, badly, and either not understanding or not giving a shit that it’s wrong? I’m not sure which of those two is worse.

alex_suzuki - 13 hours ago

From TFA:

> the diagram was both well-known enough and obviously AI-slop-y enough that it was easy to spot as plagiarism. But we all know there will just be more and more content like this that isn't so well-known or soon will get mutated or disguised in more advanced ways that this plagiarism no longer will be recognizable as such.

Most content will be less known and the ensloppified version more obfuscated... the author is lucky to have such an obvious association. Curious to see if MSFT will react in any meaningful way to this.

Edit: typo

neonihil - 5 hours ago

> The AI rip-off was not just ugly. It was careless, blatantly amateuristic, and lacking any ambition, to put it gently. Microsoft unworthy.

I'd argue that this statement is perfectly true when the word "unworthy" is removed.

My_Name - 7 hours ago

Of course, if you use AI, it's very hard to know the sources used for training that went into the output you just got.

AndroTux - 12 hours ago

“It was careless, blatantly amateuristic, and lacking any ambition, to put it gently. Microsoft unworthy.”

Seems to be perfectly on brand for Microsoft, I don’t see the issue.

nashashmi - 5 hours ago

I remember doing these kinds of knock-offs of diagrams all the time in elementary school and middle school. I wonder: if I had done this as a kid, would the author feel just as triggered?

At some point, AI transformations of our work are just good enough but not excellent. And that is where the creators’ value lies.

zkmon - 12 hours ago

That old beautiful git branching model got printed into the minds of many. Any other visual is not going to replace it. The flood of 'plastic' incarnations of everything is abominable. Escape to jungles!!

jron - 12 hours ago

Morged > Oneshotted

bayindirh - 13 hours ago

Sorry, but isn't this textbook Microsoft? Aside from being more blatant, careless and on the nose, what's different from past Microsoft?

These people distilled the knowledge of AppGet's developer to create the same thing from scratch and "thanked(!)" him for being that naive.

Edit: Yes, after experiencing Microsoft for 20-odd years, I don't trust them.

beeflet - 11 hours ago

Developer BRUTALLY FRAME-MORGED by Microsoft AI

kgeist - 11 hours ago

>What's dispiriting is the (lack of) process and care: take someone's carefully crafted work, run it through a machine to wash off the fingerprints, and ship it as your own.

"Don't attribute to malice what can be adequately explained by stupidity". I bet someone just typed into ChatGPT/Copilot, "generate a Git flow diagram," and it searched the web, found your image, and decided to recreate it by using as a reference (there's probably something in the reasoning traces like, "I found a relevant image, but the user specifically asked me to generate one, so I'll create my own version now.") The person creating the documentation didn't bother to check...

Or maybe the image was already in the weights.

kshri24 - 11 hours ago

I can already tell this is probably some AI Microslop fuck up without even clicking on the article.

EDIT: Worse than I thought! Who in their right mind uses AI to generate technical diagrams? SMDH!

ifh-hn - 7 hours ago

I had to look up what "morged" meant; it's either a blend of morph and merge, or a YouTuber. I'm going with the first.

I can't find a link to the Learn page, so I can only see what's in the article. Is this a real big deal? Genuine question; drive-by downvote if you must.

Even if this was a product of AI surely it's just a case of fessing up and citing the source? Yeah it doesn't look good for MS but it's hardly the end of the world considering how much shit AI has ripped off... I might be missing something.

usefulposter - 13 hours ago

Hey, it's just like the Gas Town diagrams.

https://news.ycombinator.com/item?id=46746045

dotdi - 12 hours ago

I guess this image generation feature should never have been continvoucly morged back into their slop machine

4ggr0 - 4 hours ago

> It was careless, blatantly amateuristic, and lacking any ambition, to put it gently. Microsoft unworthy.

well, what should i say...

zombot - 9 hours ago

Continvoucly! Well morged.

ftchd - 9 hours ago

is this the HN version of "mogged my frame"?

bitwize - 13 hours ago

I love it when the LLM said "it's morgin' time" and proceeded to morg all over the place.

whirlwin - 12 hours ago

The new Head of Quality at Microsoft has not started working there yet, so it's business as usual at MS... and now with AI slop on top.

Ref: https://www.reddit.com/r/technology/comments/1r1tphx/microso...

bschwindHN - 11 hours ago

> The AI rip-off was not just ugly. It was careless, blatantly amateuristic, and lacking any ambition, to put it gently.

That pretty much describes Microsoft and all they do. Money can't buy taste.

He was right:

https://www.youtube.com/watch?v=3KdlJlHAAbQ

isoprophlex - 12 hours ago

> The AI rip-off was not just ugly. It was careless, blatantly amateuristic, and lacking any ambition, to put it gently. Microsoft unworthy.

lmao where has the author been?! this has been the quintessential Microsoft experience since windows 7, or maybe even XP...

shaky-carrousel - 10 hours ago

> The AI rip-off was not just ugly. It was careless, blatantly amateuristic, and lacking any ambition, to put it gently. Microsoft unworthy.

LOL, I disagree. It's very on brand for Microslop.

WesolyKubeczek - 12 hours ago

I propose to adopt the word „morge”, a verb meaning „use an LLM to generate content that badly but recognizably plagiarizes some other known/famous work”.

A noun describing such a piece of slop could be „morgery”.

larodi - 11 hours ago

Everything you publish from now on will be stolen and reused one way or another.

zephen - 12 hours ago

On the one hand, I feel for people who have their creations ripped off.

On the other hand, it makes sense for Microsoft to rip this off, as part of the continuing enshittification of, well, everything.

Having been subjected to GitFlow at a previous employer, after having already done git for years and version control for decades, I can say that GitFlow is... not good.

And, I'm not the only one who feels this way.

https://news.ycombinator.com/item?id=9744059

marssaxman - 12 hours ago

It seems to me rather less likely that someone at Microsoft knowingly and deliberately took his specific diagram and "ran it through an AI image generator" than that someone asked an AI image generator to produce a diagram with a similar concept, and it responded with a chunk of mostly-memorized data, which the operator believed to be a novel creation. How many such diagrams were there likely to have been, in the training set? Is overfitting really so unlikely?

The author of the Microsoft article most likely failed to credit or link back to his original diagram because they had no idea it existed.

pwndByDeath - 13 hours ago

https://www.urbandictionary.com/define.php?term=Morged I got nothing...

amdivia - 11 hours ago

I'm failing to understand the criticism here

Is it about the haphazard deployment of AI-generated content without revising/proofreading the output?

Or is it about using some graphs without attributing their authors?

If it's the latter (even if partially), then I have to disagree with that angle. Surely a very widespread model isn't owned by anyone; I don't have to reference Newton every time I write an article on gravity, no? But maybe I'm misunderstanding the angle the author is coming from.

(Sidenote: if it was meant in a lighthearted way, then I can see it making sense)

yokoprime - 12 hours ago

A somewhat contrarian perspective is that this diagram is so simple, so widely used, and has been reproduced (i.e. redrawn) so many times that it's very easy to assume it does not have a single origin and that it's public domain.