I think there are very few roles where someone has a nice clean set of inputs and outputs and a clear connection between them. Software development (at times) is amongst the cleanest, because all the inputs are typically computer friendly, and I think that's why GenAI has had a lot of traction in our industry.
I therefore believe that even with incredibly advanced AI, there will still be a huge amount of work to do, because the world simply isn't as neat as people imagine it is. In other industries this will be even more true.
Unless it learns to make every single email or report NOT a wall of text, and to stop reusing the same telltale AI-written constructs (it’s not A. It’s B. You’re absolutely right - the key insight is that I’m writing to express my FEELING), it probably won't do much better than they do now.
> So companies have the choice of paying Chris $84,000 plus a whole bunch of benefits for 40 hours of mediocre work, or they can pay probably $100-$1000 for an AI
What makes you think they won’t price the AI much closer to what Chris was costing? They know the employer already pays that cost and the premise here is the AI works better and 24/7 (service outages notwithstanding).
> I think what we get on the other side will be far more human and meaningful. Humans building things and sharing value with other humans doing the same.
I want my AI to do dishes and laundry so that I can write and paint. Not for it to write and paint so I can do dishes and laundry.
Both companies have models "too dangerous to release". Both companies' girlfriends go to another school.
This will be the next step, and it will be nasty: making our workflows "agent-friendly" even when they are less convenient for humans. And then - yes, partial replacement.