In the last year (and a half) LLMs have taken our digital lives by storm, especially jobs that require working with a computer. Here I attempt to collect some ideas on how I think LLMs (will) impact programming, hoping that this is just a new age of software development and not the end of it. These thoughts derive from personal experience over the last months, though the many blog posts on the topic written by more experienced people certainly had an influence.

Note for finicky readers: LLMs, GPTs, ChatGPT and Copilots are used interchangeably even though they are different things.

Context

Software is ultimately used for automating (boring) stuff. Developers have for a long time felt pretty sure about their job and their ability to turn code into money. Now that ChatGPT and Copilots have entered the field, it seems like the future is not all sunshine and roses, and this can cause a little anxiety.

[Image: copilot-dev-automating]

I believe we can look at this situation from two points of view: the tool and our skills.

(Just a) Tool

After all, it is a hugely innovative tool, but it is still a tool.

I am very bad at taking pictures, and giving me a reflex camera does not increase my ability. I can take a picture, but I am not great at choosing the right angle or understanding the best light for the subject. The fact that my smartphone has a (set of) camera(s) does not make me a freaking photographer. On the other hand, someone who is good at it can work wonders even without an expensive reflex camera.

Copilots are both the reflex camera and the smartphone camera: a reflex camera for their ability to generate high-quality results, and a smartphone camera because anyone can use them even without proper knowledge, making AI and programming accessible.

Another apt definition, which I heard when Microsoft presented its fleet of Copilots at Ignite this past March, is that "it is like having half a junior and half a senior at your disposal": the junior writes all the boilerplate code, the senior can focus on a specific complicated detail when asked. LLMs have lowered the barrier to writing code, but you still have to know how to debug issues and understand the trade-offs of the proposed solution.

In the end it is the developer who takes responsibility for the code (or is git blamed). Therefore it is fundamental to understand LLMs' limitations and how to leverage them.

Indeed, since it is a tool, what matters is how we use it and our attitude towards it. And this brings us to the topic of programmers.

Programmers

Since it is a new technology, we can decide either to use it or to avoid it. Mastering the use of LLMs for coding, or for anything that involves writing, can open up a realm of possibilities, and I am not talking about Prompt Engineer positions.

For instance, we can leverage GPTs to experiment with new programming languages or with parts of the stack we are not familiar with. Personally, I have been able to work on some (small) full-stack projects with the help of LLMs. Sure, the frontend of While Model Trains could be better, but without help from ChatGPT I would be stuck with a far worse solution (or none at all).

Moreover, writing code is just one part of a programmer's job. You need to understand the requirements, know the domain, implement, deploy, test, document, and make your uncle's printer work. While implementing, writing tests, and even documenting can be carried out by LLMs (with varying degrees of quality), understanding the requirements, knowing the application domain, and figuring out how to deploy the final solution that your cloud provider did not document clearly are still things humans do better.

The (eventual) commoditization of code may reduce the demand for programmers, but skilled programmers will be even more important. You can buy a suit for a couple hundred bucks in a store (LLM code), but if you want something that fits you perfectly (a solution for a specific use case) you will pay good money for a skilled tailor (a good developer). Trust me, I had to buy a suit for my wedding last year!

Educated Guesses

Who does not like trying to guess what will happen? Since we believe we are good at it, let's share some guesses so we can check in a year (or ten) how many of them are wrong. Given the speed of innovation in the field, some of these bets may already have come true between the draft and the publication of this piece.

  • Domain knowledge will be even more important
  • People will learn to express themselves better to fully use LLMs
  • The ability to use LLMs will be taken for granted, like the use of git and a code editor
  • No-code will succeed, since the code will be natural language
  • Low-code platforms will suffer, since people will either prefer to write in plain English or tweak code written by Copilots
  • The printer will not work