Warranty Void If Regenerated
nearzero.software
488 points by Stwerner a day ago
As an experiment I started asking Claude to explain things to me with a fiction story and it ended up being really good, so I started seeing how far I could take it and what it would take to polish it enough to share publicly.
Over the last couple months, I've been building world bibles, writing and visual style guides, and other documents for this project… think the fiction equivalent of all the markdown files we use for agentic development now. After that, this was about two weeks of additional polish work to cut out a lot of fluff and a lot of the LLM-isms. Happy to answer any questions about the process too if that would be interesting to anybody.
donatj - 21 hours ago
I'm trying to sort out my own emotions on this. I did not realize this was AI generated while reading it until I came to the comments here... and I feel genuinely had? Like "oh wow, you got me"... I don't like this feeling. It's certainly the longest thing (that I know about) I've taken the time to read that was AI generated. The writing struck me as genuinely good, like something out of The New Yorker. I found the story really enjoyable. I talk to AI basically all day, yet I am genuinely made uneasy by this.

hmokiguess - 8 hours ago
Maybe it's because I think your comment throws away a lot of relevant context from OP's submission on HN. He says he spent months on this piece and then some; I think it's safe to assume here that this was well supervised, guided, thoughtful, and full of human intent despite the AI-assisted part. In short, I think calling it "AI generated" takes away all the human effort that went into these months, and the ingenious creativity of OP in crafting this piece! Anyways, I enjoyed it. :)

cestith - 6 hours ago
Reading it, I get the feeling the author worked the story the way Tom Hartmann works those agricultural machines. The AI gave input, but the author was tweaking it with human knowledge and wisdom.

_dwt - 20 hours ago
It's a major bummer. When I first read the story (a few days ago, maybe?) I thought it was an interesting metaphor that didn't quite line up with the observed details of software development with AI. I assumed the writer was a journalist or author with a non-technical background trying to explore a more "utopian" vision of where trends could go. Without the inferred writer, it's much less interesting to me, except as a reminder that models change and I can't rely on the old tics to spot LLM prose consistently any more.

abeindoria - 17 hours ago
Surely you see it's somewhat unreasonable? As if it was written by an author you disliked, and until you knew of the fact, you quite enjoyed it. Quite honestly, I do that sometimes too -- but I _know_ that it's unreasonable.

vincnetas - 10 hours ago
Can I compare this with fucking an inflatable doll? (Not done this, just extrapolating.) Even if the sensations for your penis are identical, the whole experience is totally not the same as doing it with another live person.

_dwt - 15 hours ago
For me, "interestingly wrong" becomes just "wrong" without human thinking behind it. I wasn't bowled over by the prose; I just thought it was an uncommon take and didn't twig the signs that it was a Claude product.

y0eswddl - 17 hours ago
Hard to form an emotional connection with the emotionless.

idiotsecant - 15 hours ago
Says the parent post, while thinking a stack of rocks that looks a little like a fat raccoon is kind of cute. Humans are designed to form emotional connections with non-emotional things. It's sort of our whole deal.

abeindoria - 16 hours ago
Eh, people form emotional connections with inanimate objects, so I'm unsure if that's a good enough argument tbf.

zarzavat - 15 hours ago
A djungelskog is not a threat. AI threatens my livelihood and my humanity. The worst part is I have to use it regardless, because I would be uncompetitive without it.

nikkwong - 20 hours ago
What is it about it that makes the story less interesting to you? It's the same story, down to the same delicate details. When AI slop stops being, well, slop, and just is everything that humans do, but much better and much more efficient -- will we have the same repulsion to it that many of us do now? I find it interesting to ponder. We look at the luddite movement as futile and somewhat fatalistic in a way. I feel like the current attitude towards AI-generated art will suffer the same fate -- but I'm really not quite sure.

devin - 20 hours ago
What is your understanding of the luddite movement? I ask because I don't believe many are aware that luddites were not anti-technology. It was a labor movement targeted at exploitation by factory owners. Their issue was with factories forcing the use of machines to produce inferior products so owners could use cheaper, low-skill labor. https://www.vice.com/en/article/luddites-definition-wrong-la...

CamperBob2 - 18 hours ago
Right, wrong, whatever. The one thing every sane person can agree on is that it's a good thing the Luddites didn't prevail.

devin - 22 minutes ago
How much did you pay for the shirt you're wearing now?

hatsix - 17 hours ago
haha, if you knew me you would realize that I am exactly the wrong person to be asking that specific question. I'd have been ok if things had fallen more in their direction... I'm not saying "clear win", but a middle ground that had the machines do the things they're best at while letting humans do the quality work.

defrost - 16 hours ago
> but a middle ground that had the machines do the things they're best at while letting humans do the quality work.

By arguing for letting humans work, particularly quality work, you're not especially finding a middle ground; you're more adopting the 1811 position of the OG Luddites, who were opposed to being put out of work. Yeah, that's a fine sentiment in general, but let's hear some specifics.

idiotsecant - 15 hours ago
I think two sane things. 1) It's good in the long run that they didn't prevail at that time. 2) They did actually, in fact, have a point.

taneq - 12 hours ago
I mean, obviously they had a point? No one wants to lose their job.

sebzim4500 - 6 hours ago
Everybody wants to lose their job. Almost by definition, your job is something you do not because you want to, but because you need to earn a living. Even if your job coincides with your hobby, you would prefer not to have your economic welfare tied to it in a way that drives how you engage with it.

CamperBob2 - 4 hours ago
We are on the verge of making this possible, if a bunch of myopic morons -- people who have never been right about a single long-term trend in history -- can be convinced not to screw it up.

y0eswddl - 17 hours ago
Once again showing how little you actually understand about the movement you decry.

lubujackson - 18 hours ago
Stories are particularly troubling because we have the concept of "suspending disbelief", and readers tend to take a leap of faith with long-winded narratives because we assume the author is going somewhere with the story and has written purposefully. When AI can write convincingly enough, it is basically a honeypot for human readers. It looks well-written enough; the concept is interesting and we think it is going somewhere. The point is that AI cannot write anything good by itself, because writing is a form of communication. AI can't communicate, only generate output based on a prompt. At best, it produces an exploded version of a prompt, which is the only seed of interest that carries the whole thing. Somebody had that nugget of an idea which is relevant for today's readers. They told the AI to write it up, with some tone or setting details, then probably edited it a bunch. If we enjoy any part of it, we are enjoying the bits of humanity peeking through the process, not the default text the AI wrote.

nikkwong - 14 hours ago
Right, but in the present case we have exactly what you're describing -- a story, almost fully written by AI but with some human cherry-picking in the mix. And readers are finding it a phenomenal story and then wanting to vomit retrospectively on learning about the authorship. It just seems patently obvious to me that this is not where the sentiment is going to stay -- it will move to the margin, like the people who decide not to own a cell phone, or those who would rather listen to analog audio; there will be a market for it, but it will exist at the margin. Eventually, especially for young people, more and more of what they consume will be AI generated and they won't care, because it's indistinguishable from human work. Or it will be distinguishable from human work, but only because it's so much better than anything a human could ever have created. These AI tools that we have now are as dumb as they will ever be. If we ever reach AGI or superintelligence or whatever -- or even if not, even if these tools just advance for 10 more years on their current trajectory -- it's easy for me to imagine a scenario where the machines can generate something so perfect to your liking that you just prefer it to anything a human ever would have created, storytelling and all.

Take the general case, where AI can just generate a better movie than a team of humans ever plausibly could. After all, AI doesn't have any of the physical constraints of a movie studio -- the budget, the logistics of traveling from location to location, the catering, the fact that the crew has to sleep and coordinate schedules, all that. AI, with some human involvement or not, could just keep iterating on a script on a laptop overnight until it's created an optimized version more satisfying to humans than any human-made movie ever created. Or, in the narrow case, it could create the perfect movie for you, given what it knows about you and your interests. All human movies would look inferior. For my kids, who I'm sure are going to grow up in a world where this type of art is embedded everywhere -- and where the human version is almost certainly going to be worse -- I don't think the desperate cries to see the last scrap of human ingenuity will mean anything. All of the people throwing rocks at Waymos, and those boycotting companies for generating ads rather than shooting one with a video studio: it's so obviously desperate and futile in the face of what's coming. I mourn the future that seems plausible here, but I also welcome it as inevitable. The technology is coming, and people are going to have to adapt one way or another.

guitarlimeo - 12 hours ago
You're talking about content. Only content can be "perfect" as you say. When I'm listening to music, looking at art, or seeing a play or a short film, I want to feel a connection to the humans behind it. AI is by definition missing that connection. That's what makes me retrospectively vomit at AI writings like these. That connection requires that the humans behind it are imperfect; the solo can have one or two sloppy notes, but at least it's a genuine interaction. We have seen this same yearning for connection in all the "Don't use an LLM to comment, use your true style of writing with its flaws" rules. I'm 100% certain mainstream studios will be producing "perfect" content with AIs, just like current mainstream pop stars have 10 ghostwriters working on each song to create "perfect" songs. The good stuff will exist on the fringes as always, and I'm ok with that, as I already have been for years.

donkeybeer - 11 hours ago
And the future may not be as settled as you think it is. Leaders try to sell you their vision of the future by saying it is settled and that things are certain, but that is because they want you to believe it; if you and the masses believe so, it's more certain the future will settle the way the leaders want. But you can also actively refuse that future and find a different future that's worth believing in yourself. The riff comes first, the people come second. One of the nice things about punk and metal is how anti-celebrity both genres are in a fundamental way. In histories of the genres, you will usually find that such and such band made such and such invention that made certain new structures accessible. Of course the social background of the scenes where it emerged is important too, but the history is traced first in terms of the riff. Books glazing a particular rock star's life history are rare, even though there are some "superstars" in metal and punk. The culture is very "only analog is real, digital's fake shit", but in some other ways they seem much closer to having little difficulty accepting a valid musical work regardless of its origin.

guitarlimeo - 11 hours ago
I don't quite understand what you're getting at with this comment. In metal and punk it's a pretty cornerstone of the genre to be authentic, and in metal to value human skills (all the solo parts, fast playing). I've played and listened to punk and metal my whole life, but I will also enjoy early Lady Gaga, Eminem, Kendrick, etc., because I recognize their authenticity and skills. Sabrina Carpenter and Drake go over my head because of blatant ghostwriting, and even though they have good tunes, I vomit retrospectively. So what is AI bringing to the fans of these genres that the fans might value? Because it's not authenticity, nor is it skills. What is the point you're trying to make?

donkeybeer - 11 hours ago
I am saying that on the surface they might seem like they should be the staunchest opponents, and as I said, the culture is "only cassette tape is real, otherwise fuck off and die", but simultaneously it's also one of the least image/player-focused genres in some ways: what is being played is of much higher priority than who in specific is playing it.