All I found was a human name given as the author.
We might generously say that the AI was a ghostwriter, or an unattributed collaboration with a ghostwriter, which IIUC is sometimes considered OK within the field of writing. But LLMs carry additional ethical baggage in the minds of writers. I think you won't find a sympathetic ear from professional writers on this.
I understand enthusiasm about tweaking AI, and/or enthusiasm about the commercial potential of that right now. But I'm disappointed to find an AI-generated article pushed on HN under the false pretense of being human-written. Especially an article that requires considerable investment of time even to skim.
I did not realize this was AI generated while reading it until I came to the comments here... And I feel genuinely had? Like "oh wow, you got me"... I don't like this feeling.
It's certainly the longest thing (I know about) I've taken the time to read that was AI generated. The writing struck me as genuinely good, like something out of The New Yorker. I found the story really enjoyable.
I talked to AI basically all day, yet I am genuinely made uneasy by this.
Without the inferred writer, it's much less interesting to me, except as a reminder that models change and I can't rely on the old tics to spot LLM prose consistently any more.
I find it interesting to ponder. We look at the luddite movement as futile and somewhat fatalistic in a way. I feel like the current attitude towards AI generated art will suffer the same fate—but I'm really not quite sure.
https://www.vice.com/en/article/luddites-definition-wrong-la...
> I assumed the writer was a journalist or author with a non-technical background trying to explore a more "utopian" vision of where trends could go.
If you assume you're reading something from a person with intention and a perspective, who you could connect with or influence in some way, then that affects the experience of reading. It's not just the words on the page.
But if you knew it came from a human, it would be interesting as a window into what the writer was thinking.
Since there is no writer, that window doesn't exist either.
Read my comment below for a perspective.
With stories that shared experience is between author and reader. Book clubs etc will try to extend that "shared experience" but primarily it is author <-> reader relationship.
Remove that "shared feeling with the author" and what meaning does it have?
It means, "Wow. Cool. I'm a member of a species that taught rocks to think. Holy fuck. That's pretty insanely fucking awesome. Wow. Wow, wow, wow. Fuck."
That's about all it means. Nothing was removed from your life, but something optional was added.
And yet, in ironic counterpoint, there is a different artist I follow on Spotify that does EDM-fusion-various-world-genres. And it’s very clearly prompt generated. And that doesn’t bother me.
My hypothesis is that it has to do with how we connect/resonate with the creations. If they are merely for entertainment, then we care less. But if the creation inspired an emotion/reasoning that connects us to other humans, we feel betrayed, nay, abandoned, when it turns out to be synthetic.
One of the many things I love about art is when I encounter something that speaks to emotions I've yet to articulate into words. Few things are more tiring than being overwhelmed with emotion and lacking the ability to unpack what you're feeling.
So when I encounter art that's in conversation with these nebulous feelings, suddenly that which escaped my understanding can be given form. That formulation is like a lightning bolt of catharsis.
But I can't help but feel a piece of that catharsis is lost when I discover that it wasn't a human's hand that made the art, but a ball of linear algebra.
If I had to explain, I guess I would say that it's life affirming to know someone else out there in the world was feeling that unique blend of the human experience that I was. But now that AI is capable of generating text, images, music, etc. I can no longer tell if those emotions were shared by the author or if it was an artifact of the AI.
In this way, AI generated art seems more isolating? You can never be sure if what you're feeling is a genuine human experience or not.
I think if you left it to its own devices, some of the narrative exposition stuff that humanized it would go off the rails.
It's really interesting to hear about others that have been exploring generating fiction with Claude. Based on some of the comments, I clearly need to do some more work, but it has been really interesting discovering and coming up with different techniques, both LLM-assisted and manual, to end up with something I felt confident enough about to put out.
I'd be curious to hear more about your experience!
It felt like it was written by someone trying to quit an addiction to Corporate Memphis bullshit content spam. Like it came from some weird timeline where qntm was a LinkedIn influencer. It straddles an uncanny valley of being a criticism of the domination of The Corporation over human culture while at the same time wallowing in The Corporate Eunuch Voice, not because it's a subversion of form, but because it knows no other way.
I then came to the comments section and found the piece that brought the picture into focus.
It's just... hard to explain the specific kind of disappointment. Perhaps there is a German phrase-with-all-the-spaces-removed kind of word that describes it succinctly. I feel like I exist in this Truman Show kind of world where everyone is trying to gaslight me into thinking LLMs are important, but they aren't very good at it and whenever I try to find out how or why, it all evaporates away. I was very reluctant to say that because I'm sure it's going to come with a heaping side of Extremely Earnest Walruses ready to Have A Debate about it and I just don't have the energy for it anymore. That's the baseline existence right now. It's like a really boring version of Gamergate.
And then this thing comes along. And yeah, it's a thing. You got me. Ha. Ha. Joke's on me. I lost the shitty, fake version of the Turing Test that I didn't even ask to be a part of. And it reminds me of the Microsoft Hololens: a massively impressive technological achievement that was ultimately a terrible consumer experience. Like if you figured out Fusion Power but it could only power Guy Fieri restaurants.
Ever since the pandemic I've been keenly aware of the complete destruction of every enjoyable social structure around me. The meetups that evaporated. The offices we essentially squatted in that suddenly turned Extremely Concerned about what people were doing. The complete lack of any social interaction at work because we're all so busy because we're running at half-workforce and pretty sure the executive suite is salivating at the bit to lay the rest of us off. The lack of care about how this is impacting open source software. The lack of concern for people.
I feel like my entire adult life was this slow, agonizing, but at least constant push forward into recognizing the humanity in others and creating a kind and diverse world, and then overnight it's all been destroyed and half the people I see online are cheering it on like it's Technojesus coming to absolve them of their sins of never learning to invert a binary tree. Where the blogs and books and startups of the early 2000s were about finding the hidden potential in people--the college dropout working as a barista who just needs someone to give them a chance to be a programmer or a graphic designer or an artist or whatever--the modern era seems to all be about the useless middle management guy who never had any creative bone in his body no longer having to write status reports to his equally mendacious boss on his own anymore.
We might be restarting old coal plants, but at least Kevin in middle management gets to enjoy "programming" again.
Some inconsistencies that stuck out / that I found interesting:
- HWY 29 doesn't run through Marshfield; it's about 15 miles north.
- not a lot of people grow cabbage in central Wisconsin ;)
- no corrugated sheet metal buildings like in the first image around there
- I don't think there's a county road K near Marshfield - not in Marathon County at least
FWIW I think this story is neat, but wrong about farmers and their outlooks - agriculture is probably one of the most data-driven industries out there, and there are not many family farmers left (the kinds of farmers depicted in this story); it is largely industrial scale at this point.
All that said, as a fictional experiment it's pretty cool!
Really a great story, and to the extent it was AI-written, well... even greater.
I'm happily surprised (frankly amazed TBH) that the submitter didn't get bawled out by people flagging the post and accusing him of posting slop.
Can you elaborate on this?
Hard to imagine many occupations that have undergone more radical change in the recent past than farming. The profession is now utterly technology-dependent, and a few companies like John Deere have hastened to take unfair advantage of that. Hence the growing advocacy of right-to-repair laws.
But I was able to get through the text, it's pretty good, you did great work cleaning it up. There's just a bit more to do to my taste.
The story is good.
I am also extremely interested in thinking about where software development is going, so I really appreciated the ideas that went into this.
Since you seem open to feedback, I want to add that I felt the generated images were a negative addition. Maybe they wouldn't be if they also got a little polish - the labels in them were particularly bad.
And thanks for the note about the images, I'll take that into account! I only really just started this project and am going to keep iterating as I learn to use the tools better and I find the right visual language for it.
Since you seem in the mood to give feedback ;) If you take a quick glance at the previous story, do you feel the same way about the images in that one or was it just this one's that you found particularly unpolished?
“Yeah, I updated the silage ratios. What does that have to do with milk prices?”
“Everything.”
He showed Ethan the chain: feed tool regenerated → output format shifted → pricing tool misparsed → margins calculated wrong → prices dropped → contracts auto-negotiated at below-market rates. Five links, each one individually innocuous, collectively costing Ethan roughly $14,000.
Ethan looked ill.
--
I've re-read this a few times now, and can't work out how the interpreted price of feed going up and the interpreted margins going down results in a program setting lower prices on the resulting milk? I feel like this must have gotten reversed in the author's mind, since it's not like it's a typo, there are multiple references in the story for this cause and effect. Am I missing something?
[Edited for clarity]
The premise/structure/flavor of TFA is an almost pitch-perfect imitation of that kind of voice, to the point that I immediately flagged it as probably generated. I actually think a modern person would have some difficulty even in consciously mimicking it. There's an "aw shucks" yokel-thrown-into-the-future aspect to it. Plot-wise you have rural bicycle repair shop that expands operations to support nuclear reactors and that sort of thing. Substitute any of the more atomic-age stuff for AI stuff and you're mostly there. If you have some Amazing Stories from the 1920s on your shelf then you kind of know what I mean.
Which is totally fair, I'm honestly not! I haven't read much of that myself
It was the text equivalent of hearing a singer whom you know has perfect pitch sing atonal playground songs.
Take this sentence:
Tom had been an agricultural equipment technician, which meant he’d fixed tractors, combines, GPS guidance systems, and the increasingly complex control software that made modern farming possible.
Perfectly fine, a nice set up for a next sentence, but then you get hit with this:
He’d worked for a John Deere dealership in Marshfield for eleven years.
Bad. The rhythm is all off. Minor improvement:
For eleven years he had worked for a John Deere dealership in the nearby town of Marshfield.
Minor change, really, but the fluidity of the language matters a lot and just that one sentence written that one way breaks the flow.
It's almost as if a second person interjected and wrote that sentence, like a friend's annoying girlfriend who won't let him finish a story without adding in her parts.
But two notes do not a melody make, so let's compare that one minor change with a before and after of all three opening sentences:
Original:
Tom had been an agricultural equipment technician, which meant he’d fixed tractors, combines, GPS guidance systems, and the increasingly complex control software that made modern farming possible. He’d worked for a John Deere dealership in Marshfield for eleven years. Then the transition happened, and the dealership’s software repair business evaporated; the machines still needed repair, but the software on the machines stopped being something you repaired.
Modified:
Tom had been an agricultural equipment technician, which meant he’d fixed tractors, combines, GPS guidance systems, and the increasingly complex control software that made modern farming possible. For eleven years he had worked for a John Deere dealership in the nearby town of Marshfield. Then the transition happened, and the dealership’s software repair business evaporated; the machines still needed repair, but the software on the machines stopped being something you repaired.
* this is a good attempt at a work of art, but written in a generic style that detracts from it
* nobody making genuinely good attempts at art like this would also write so generically
* and if they were making it generic on purpose, they wouldn't be able to do it so flawlessly
* oh, it must be AI
I guess I can discern the presence of a human artist, but only in the idea, which just means it was a good prompt.
Prompts in, garbage out.
I'm mildly thrown off by some inconsistencies. Carol says "I’ve been under-watering that spot on purpose for thirty years," and then a paragraph down Tom's thoughts say "Carol didn’t know that she under-watered the clay spot." Carol considers a drip irrigation timer the last acceptable innovation, but then the illustration points to the greenhouse as the last acceptable innovation. Several other things as well, mostly in the illustrations.
Are these real inconsistencies or am I misunderstanding? Was this story AI-assisted (in part or all)? Is this meta-commentary?
I guess I'm also learning the value of working with an editor from first principles... over the last couple weeks before publishing I read through and made edits to this piece at least twice a day and still didn't catch this.
I don't think that phrase means what you are trying to say here.
What it doesn't mean:
- learning by doing
I believe it generally means: a formalization that comes after a subject is understood so well that you can reduce it to "first principles" that imply the rest. Or, the production of a hypothesis by deduction from widely-accepted principles.
It's written like this is a dystopia but billing $180/45 minutes in rural low cost of living area sounds awesome. And the choreographer billing "more than a truck" for three weeks? The dream!
Well, then, you gotta move on to reading better science fiction. Because this is pretty damn bland. I gave up after 2 minutes because of it. Kinda feel vindicated after coming to the comments.
I can see it working for casual readers, which is why it's already an editorial problem. Imagine having to sift through a growing number of faux writers sending publishers AI generated prose.
However, I do wonder if it is a bit too hung up on the current state of the technology and the current issues we are facing. For example, the idea that the AI-coded tools won't be able to handle (or even detect) that upstream data has changed format or methodology. Why wouldn't this be something that AI just learns to deal with? There is nothing inherent in the problem that is impossible for a computer to handle. There is no reason to think AIs can't learn how to code defensively for this sort of thing. Even if it is something that requires active monitoring and remediation, surely even today's AIs could be programmed to monitor for these sorts of changes and modify existing code to match them when they occur. In the future, this will likely be even easier.
The same thing is true with the 'orchestration' job. People have already begun to solve this issue with the idea of a 'supervisor' agent that designs the overall system and delegates tasks to the sub-systems. The supervisor agent can create and enforce the contracts between the various sub-systems. There is no reason to think this won't get even better.
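A minimal sketch of the "contract" idea described above, assuming a hypothetical pipeline (all names and the schema are invented for illustration): a supervisor validates the format of an upstream tool's output before a downstream tool consumes it, instead of letting a silent format change propagate the way the story's feed-tool → pricing-tool chain does.

```python
def validate_contract(record: dict, schema: dict) -> list[str]:
    """Return a list of contract violations for one upstream record."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(record[field]).__name__}"
            )
    return errors

# The contract the downstream pricing tool depends on (hypothetical).
FEED_SCHEMA = {"silage_ratio": float, "cost_per_ton": float}

# An upstream regeneration silently changed cost_per_ton to a string.
regenerated_output = {"silage_ratio": 0.42, "cost_per_ton": "310.00"}

violations = validate_contract(regenerated_output, FEED_SCHEMA)
if violations:
    # Halt the chain instead of letting the pricing tool misparse it.
    print("contract violations:", violations)
```

In the story's terms, the five-link chain breaks at link one: the run stops with an explicit violation rather than quietly repricing the milk.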
We are SO early in this AI journey that I don't think we can yet fully understand what is simply impossible for an AI to ever accomplish and what we just haven't figured out yet.
I feel like this ultimately boils down to something similar to nocode vs code debates that you mention. (Is openclaw having these flowcharts similar to nocode territory?)
At some point, code is more efficient at doing so. Maybe people will then have this code itself be generated by AI, but then once again you are one hallucination away from a security nightmare - or doesn't it become an openclaw-type thing once again?
But even after that, at some point, the question ultimately boils down to responsibility. AI can't bear responsibility and there are projects which need responsibility because that way things can be secure.
I think the conclusion from this is that we need developers in the loop for responsibility and checks, even if AI-generated code stays prevalent. We are already seeing some developers go ahead and call themselves slop janitors, in the sense that they will remove the slop from the codebase.
I do believe that the underlying reason is responsibility. We need someone to be accountable for code, someone who understands it well enough to take a look and prevent things from going south in case your project requires security - which is all but necessary for anything production-related, not just basic tinkering.
I've mostly been digging through my own version of that and trying to find things I find interesting and seeing what kinds of stories we can build about what a day in that job might look like.
For the exact same reason why there is absolutely no technical reason why two departments in a company can't talk to each other and exchange data, but because of <whatever> reason they haven't done that in 20 years.
The idea that farmers will just buy "AI" as a blob that is meant to do a thing, and these blobs will never interact with each other because they weren't designed to (as in - John Deere really doesn't want their AI blob to talk to the AI blob made by someone else, even if there is literally no technical reason why it shouldn't be possible), seems like the most likely way things will go - it's how we've been operating for a long time and AI won't change it.
Or you can ask the agent to do this after each round. Or before a deploy. They are great at performing analysis.
I don't know if this is what the future will look like, but this looks realistic. And if my non-existent grandson starts re-coding my business without asking, he's going to spend the next six months using K&R C.
It's a long article, and from skimming I see talk of farming, software, GPS. I can't tell whether this is worth investing time to read if I can't even tell what it may be about.
Edit: got it right!
30 minutes ago it was on the front-page, now I can't find it listed in the top 200.
Imagine my surprise when I read it was AI-generated.
> If the world must change, I hope at least we still tell such stories and share how we feel within that change. If so, come what may, that's a future I know I can live in.
This is really the whole idea behind this project with Near Zero. I think there's a lot of anxiety out there around AI and the future, I was there for a while too. Ultimately I've ended up pretty optimistic about it all, and inspired by what the group at Protocolized is doing, found science fiction a great way to help express that.
Just saying that everything is going to go to shit and one or two corporations will take over everything... Maybe, but I've heard that story already.
But on reflection and discussion with the author, we decided that enough HN users may find that it gratifies intellectual curiosity, because it's interesting to see how a human and an AI bot can collaborate to create writing like this.
We just asked the author to write an introduction to make it clear it's AI-generated and explain their process.
FTFY
edit: since I'm being downvoted, I'll clarify that my interest is in the human artistic input behind this mathematical model output
My short answer to “why should I care about the mathematical model output from the human artistic input” is “I think we’re all figuring that out right now!” And I’m pretty sure the answer isn’t “you shouldn’t care at all”. Especially if the mathematical model output from the human artistic input expresses what the human wants to express at a quality level that passes that human’s “Taste Gap” (https://www.youtube.com/watch?v=91FQKciKfHI)
I’m sure we could go back and forth about this a lot (and happy to keep this conversation going, I truly do feel like exploring and discussing, this is very interesting to me!) so happy to dig into any aspect with you :)
I will say that I think what’s happening is that we’re seeing more people explore art forms they couldn’t before because of mechanical skill gaps, and that’s interesting in the same way that synthesizers and sampling and software instruments were for music, or I imagine digital tools were for physical art, or digital photography was for film photography, which did the same to painting. It’s an interesting time to be alive!
Regarding your personal input: this is an order of magnitude less imaginative than tapping some keyboard keys. It's not your imagination that produced the majority of this story; it's unfair to claim any aspect of this process except your prompts. Which is why I asked for the prompts. I'm not here to hate on your artistic expression, just as I'm not here to listen to the sum total of humanity's creativity that has been poked and prodded into maximising shareholder value. Some people might be interested in that - frankly I doubt they would be, if they empathized with a painter or writer or producer (or had any clue how easy it is to manipulate humans). Me, myself, I'm here for your creativity and yours alone. Not that of Anthropic (who, like other AI companies, stole it).
By pushing out this work, there's nothing stopping you from having inadvertently acted as a conduit for a corporation to deliver its message. How do you know that you haven't accidentally pushed out a work with hidden messages embedded within? Do you know how good LLMs are at encoding and decoding hidden meaning?