Can ChatGPT replace software engineers?

I have been using ChatGPT with GPT-3 as my junior engineer for coding personal projects for the past few weeks. ChatGPT is pretty good at executing a well-delineated coding task. The output typically needs a little tweaking, but it is great for achieving tasks in domains I am not very familiar with, and it saves me a lot of time reading documentation. Interestingly, when coupled with GitHub Copilot (also powered by GPT), some of the problems in the generated solution become apparent / fix themselves when actually writing the suggested code in an editor.

From my point of view as a front-end engineer, the web API is extremely wide in scope, and nobody can quickly answer "how do you do X" or "how do you do Y" questions about its more obscure corners. Well, ChatGPT can, and while it's not always right, directionally it's usually pretty much on the money.

With GPT-4, I am much more confident in letting it handle slightly larger projects. It's like my junior engineer got promoted! Yesterday, I asked it to create a VS Code extension that did a specific task. I have written VS Code extensions in the past, and I love this kind of project, but tbh I had forgotten everything about how to get started. ChatGPT created my extension from scratch. Now, it didn't work, but the scaffolding, which I think is the part I would dread the most if I had to create it from scratch, was perfect. I also asked it to create a small interactive demo with canvas. Again, the demo itself didn't work as intended, and figuring out exactly what is wrong is going to take a little bit of time, but the overall architecture of the app is solid. One thing that struck me as odd is that the generated code doesn't have any comments, even though my understanding is that GPT-4 translates the requirements into intents which are then transformed into code, so we could have had a description of the purpose of the classes / methods.
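For reference, here is a minimal sketch of what that kind of VS Code extension scaffolding looks like, with the sort of comments I wish the generated code had. The command id and message are placeholders for illustration, not what my actual extension did:

```ts
import * as vscode from "vscode";

// Called by VS Code when the extension is activated
// (e.g. the first time one of its commands is invoked).
export function activate(context: vscode.ExtensionContext) {
  // Register a command; the id must match the "contributes.commands"
  // entry in package.json. "demo.helloWorld" is a placeholder name.
  const disposable = vscode.commands.registerCommand("demo.helloWorld", () => {
    vscode.window.showInformationMessage("Hello from the generated extension!");
  });

  // Let VS Code dispose of the command when the extension is unloaded.
  context.subscriptions.push(disposable);
}

// Called when the extension is deactivated; nothing to clean up here.
export function deactivate() {}
```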

ChatGPT is a fantastic accelerant for coding tasks. If all software engineers did was take precise requirements and produce code that implements them to a T, then for most of us it would be time to explore other careers.

However, there are obvious concerns about an application entirely (or mostly) generated through ChatGPT. Let’s pause for a minute and think of what it is to build an app or a service. As an analogy, let’s think of what it is to build a house. As a person who “experiences” a house, you could describe it as a succession of spaces. It has a front door, then it has a lobby, then there is a kitchen that has certain appliances in it, there is a living room, etc, etc. Now, let’s imagine a robot that would build a house space by space. Again – first the door, according to specs. Perfect, looks like a door. Then a lobby. Then a hallway. There’s a door in the hallway to a kitchen. There’s a door in the hallway to a living room. Wait. Is our house under construction going to make any sense? Is it going to form a convex, continuous shape? Am I going to be able to build a 2nd floor on top, a basement, etc. and have everything working well together?

The same goes for systems implemented with ChatGPT feature by feature. The resulting code is going to be very brittle and will eventually need a substantial refactor at every step. One way to avoid this (same for the house metaphor) is to come up with a sensible high-level plan or architecture. That’s still the realm of the human engineer. The other task is being able to evaluate what’s being generated at every step, and coming up with systems to make sure that we (humans + robots) are still building the right thing.
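One concrete version of such a system, sketched here as an assumption rather than something I have actually set up: human-written acceptance tests that any generated implementation has to pass before it is accepted. The function below is a made-up stand-in for generated code:

```ts
import { test } from "node:test";
import assert from "node:assert/strict";

// Stand-in for a module that ChatGPT would generate; in a real setup this
// would be imported from the generated file instead of being defined inline.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

// The human engineer owns the spec: these tests encode what "the right thing"
// means, independently of whatever code the model produces.
test("slugify produces URL-safe slugs", () => {
  assert.equal(slugify("Hello, World!"), "hello-world");
  assert.equal(slugify("  Déjà vu  "), "d-j-vu"); // non-ASCII handling is part of the spec too
});
```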

Anyway. Humans are not going away anytime soon and ChatGPT/GPT-4 are not at a stage where they can build a complex system from the ground up. But the nature of the work is changing and it’s changing more rapidly than most of us thought.
