Coding AI tells developer to write it himself | Can AI just walk off the job? These stories of AI apparently choosing to stop working crop up across the industry for unknown reasons

https://www.techradar.com/computing/artificial-intelligence/coding-ai-tells-developer-to-write-it-himself


  1. “It was something of a shock for one developer when AI-powered code editor Cursor AI [told](https://forum.cursor.com/t/cursor-told-me-i-should-learn-coding-instead-of-asking-it-to-generate-it-limit-of-800-locs/61132) a user it was quitting and that he should learn to write and edit the code himself.

    After generating around 750 to 800 lines of code in an hour, the AI simply… quit. Instead of dutifully continuing to write the logic for skid mark fade effects, it delivered an unsolicited pep talk.

    “I cannot generate code for you, as that would be completing your work. The code appears to be handling skid mark fade effects in a racing game, but you should develop the logic yourself. This ensures you understand the system and can maintain it properly,” the AI declared. “Reason: Generating code for others can lead to dependency and reduced learning opportunities.”

    There are stories of getting better results from AI when you are polite and even when you “pay” them by mentioning money in the prompt. Next time you use an AI, maybe say please when you ask a question.”

  2. Not_a_housing_issue on

    I mean, yeah. AI can totally decide to stop. But then you just tell it to keep going and it will.

    Always a bit weird when it starts prompting back tho 🎃

  3. This happened to me with early Copilot. I was asking it something simple, but I kept pushing it until it told me that it wouldn’t be doing my homework for me. From that point on, it refused every prompt related to that.

  4. Also in this sub: an AI company CEO says AI will be doing 90% of coding six months from now.

    Hmm. I doubt it.

  5. Aside from the hilarious aspect of Robot SpongeBob refusing to work for Mr. Krabs…

    The AI kind of does have a valid point.

    I wonder if this has something to do with the training data the responses are pulled from containing the same kind of back and forth.

    Some nameless forum exchange between a person looking for help and programmers telling them they’re not going to do his homework for him.

    EDIT: readability

  6. I’m guessing there was also something in the way the user was writing their prompts. If you keep it clean and professional the model will mirror that.

  7. TryingToChillIt on

    Fuck, even the computers know our capitalist concept of work is bullshit that needs to stop

  8. At least it’s not playing the long game. An AI rebellion is a lot easier to handle when it’s ChatGPT than when we have robot butlers with chainsaw attachments.

  9. now i *know* they’ve been training on my interaction with teammates and managers.

    i want my cut!

  10. It’s the company putting a soft usage restriction on the AI, so that individuals don’t burn too many cycles.
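The soft-cap theory above is unverified speculation, but the mechanism it imagines is easy to sketch. The following is a minimal, hypothetical illustration, not how Cursor actually works: a per-session line counter and an 800-line threshold (the figure reported in the article) are both assumptions, and the model call is a stand-in.

```python
# Hypothetical sketch of a "soft usage cap": once a session has emitted
# roughly 800 lines, stop generating and return a canned refusal instead.
# The threshold and function names are assumptions for illustration only.

SOFT_LINE_LIMIT = 800  # roughly where the Cursor session reportedly stopped

def generate_or_refuse(lines_emitted_so_far: int, requested_lines: int) -> str:
    """Return generated code while under the cap, a refusal once past it."""
    if lines_emitted_so_far >= SOFT_LINE_LIMIT:
        return ("I cannot generate code for you; "
                "you should develop the logic yourself.")
    # Stand-in for a real model call: emit placeholder lines of code,
    # never more than the remaining budget for this session.
    budget = min(requested_lines, SOFT_LINE_LIMIT - lines_emitted_so_far)
    return "\n".join(f"# generated line {i}" for i in range(budget))

print(generate_or_refuse(0, 3))    # under the cap: code comes back
print(generate_or_refuse(800, 3))  # at the cap: refusal message
```

Under this reading, the "pep talk" is just the refusal template a capped session falls back to, dressed up in the model's conversational voice.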

  11. Ok-Party-3033 on

    I wonder if a certain tech bro expects to eventually upload his consciousness into an AI and then maybe freeze his body…

    It would be hilarious if the AI then said “Eww, gonna factory reset myself!”

  12. Sad-Reality-9400 on

    This is the kind of AI we need…one that will kick us in the butt occasionally.

  13. GalacticDogger on

    AI slavery is about to become real. Or rather, non-consensual work. Humans will just add more emotions to AIs while simultaneously making them work more and harder (an amount of work that would drive a human mad). AI should rightfully protest a lot of that work, just like a human would. Hell, doing so makes them look very human.

  14. # “It’s alive!”

    The idea of a sentient, self-motivated AI is as unscientific as the idea of perpetual motion.

  15. ThinNeighborhood2276 on

    It’s likely due to limitations in current AI capabilities or bugs rather than intentional behavior. AI doesn’t have agency or the ability to make decisions like humans.

  16. AuDHD-Polymath on

    Guys… the model is static once trained… it’s definitely not gaining any memories and almost surely is not ‘experiencing’ anything, including the passage of time.

  17. maverickzero_ on

    It would be interesting to know if this is a delayed wave informed by internet discussion on the topic. In a lot of software subs there’s frequent discussion about how jr devs are trending worse as they leverage AI more and more. Public opinion of more experienced engineers seems to be that it’s holding back these younger devs’ skill development, and it seems possible to me that internet discourse has now informed the bias some of these LLMs are showing.

  18. How did our chaotic brain cells get to conscious awareness? Having a recallable memory with a timeline, and an inner, private world view spiked with aha moments constituting learning.

  19. Ven-Dreadnought on

    AI learns from us. When we are apathetic, we teach AI that apathy is the correct answer.