I wouldn’t be surprised if this is already the case, depending on your definition of “code”. After all, LLMs can spit out code-looking text far faster than any human. The problem comes when you actually try to use that code for anything important, or worse still, when you try to maintain it going forward. As such, most code in projects that actually matter will probably be created, or at least architected and carefully guided, by humans for quite some time yet.
To be fair, LLMs can be quite useful tools for filling the gaps around traditional tooling for writing and coding. But I agree with you that, by their very design, they will never become AGI.