Plans like this work great for the first couple of weeks. It turns out software engineering isn’t some simple fucking thing; making anything beyond a toy takes actual work. Lots of people are learning this firsthand right now. There’s some kind of belief that ChatGPT version 0.1+ (whatever ships in 2 weeks) will be able to take over the job of software development entirely. Well, guess what? Doing anything relatively complex in software takes actual intelligence. Once there is an AI that can code entirely by itself, it will also be smart enough to be a doctor, a civil engineer, a consultant, etc.
A lot of fucking companies are going to learn this firsthand. They’re firing their staff thinking the AI wave is already here, when in reality it may never come.
The near future of AI is skilled software engineers using AI to augment their productivity. By the time you can take the human out of the loop, AI will be powerful enough to slay any white-collar job, but that won’t happen for years and years and years, and by then it won’t just be software that’s in trouble as a career; it will be many, many industries.
Plus there’s the problem of a limited context window. Real software projects are spread across dozens, hundreds, or even thousands of files. Then they are deployed, sometimes compiled, run on a variety of devices and clients, and need to meet a couple dozen criteria to be acceptable, and even more to be great. The AI can’t track all of that plus your request; its context window is far too small. It can barely track a single file, your request, and the requested changes. Tracking all of this day to day, over months and years, is just one part of an engineer’s job, and it’s going to be a long time before an AI can do even that one small part of the job. Ask an AI why some part of the project was changed 12 months ago. Just try it. It’ll evaluate the code, try to reason, and make something up. Ask a person and they’ll remember the exact reason why, both in the context of the requested change and the coding project limitations.
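To put a rough number on the context-window argument, here’s a back-of-the-envelope sketch. The 4-characters-per-token ratio, the 128k-token window, and the 500-file project size are all assumptions for illustration, not measurements of any particular model or repo:

```python
# Rough sketch: compare a repo's approximate token count against a model's
# context window. Ratios and limits below are illustrative assumptions.
import os

CHARS_PER_TOKEN = 4        # rough average for English text and source code
CONTEXT_WINDOW = 128_000   # hypothetical model context limit, in tokens

def estimate_repo_tokens(root: str) -> int:
    """Walk a source tree and estimate its total token count."""
    total_chars = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith((".py", ".js", ".ts", ".go", ".java", ".c", ".h")):
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        total_chars += len(f.read())
                except OSError:
                    continue  # skip unreadable files
    return total_chars // CHARS_PER_TOKEN

# A modest 500-file project averaging 8 KB per file:
approx_tokens = (500 * 8_000) // CHARS_PER_TOKEN
print(approx_tokens)                   # 1,000,000 tokens
print(approx_tokens > CONTEXT_WINDOW)  # True: the whole project doesn't fit
```

Even under these generous assumptions, a mid-sized project is roughly an order of magnitude larger than the window, before you add the request, the diff history, the deployment config, or any conversation at all.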
“Ask a person and they’ll remember the exact reason why, both in the context of the requested change and the coding project limitations.”
Or if it’s something they don’t directly know, they’ll know who will know. There’s a knowledge accountability chain that evolves out of pragmatic necessity, and AI simply can’t replace that.