

How is this not yet understood? The models will repeat what they've been trained on. Nothing more, nothing less. Are they trained only on the solutions to these issues? No. They are also trained on code implementing these same issues, so of course they will output them unless instructed not to.
Developer jobs are just fine.
You're not wrong, and it sucks. I hold out some hope that businesses will learn fairly quickly that you simply can't sell a product that doesn't work, and that relying on LLMs to build your product will always cause issues, since that's simply not what the technology was ever designed to do.
Where I'm worried is that people attach themselves to brands to the point of making them part of their personality, so as some of these companies begin enshittifying their products with LLMs, the customers will simply keep paying for a worse product because, to them, the company can do no wrong.