I am penning down some thoughts on LLMs. Years from now I want to look back and see how I thought about things. It will be humbling, I know. Hindsight is a liar. I don't want to delude myself into thinking that I always had the right idea. In truth, I have no idea where the industry is headed.
These are my predictions, and it's O.K. that they are wrong.
LLMs and Software Engineering
- Wages will go down: in part due to a recession, in part due to discourse about engineers being replaced by AI. That fear alone is enough to drive wages down. Companies would love to pay less for software expertise, and they are eager to believe the story the AI industry tells them.
- AI is not merely a convenient excuse to fire people (I've seen this argued); executives actually believe it will replace people.
- It will lower the barrier to writing software. This will also exert downward pressure on wages, though the newly-minted developer will quickly hit a wall where they will have to learn details. At that point our extant self-selection process resumes. You've got to be detail-oriented and patient to write software. Woe to the vibe coder who isn't!
- At the top level of the industry wages will remain high. Critical software that drives revenue will still need to run or be updated as reliably and as quickly as possible. Companies will want the best for critical systems, and the influx of new engineers won't actually increase supply for this role.
- Software engineering AI tools will become more elaborate. Cursor, Aider, and Copilot are just the start. We will see attempts at full feature development. Human involvement will shift to guiding and specifying what the AI should do. It will work sometimes.
- This will make internal tools cheaper. Good!
- This will make personal projects cheaper. Also Good!
- Entire codebases and features will be built with AI tools. Then the tool will hit a wall. At that point you'll need to bring someone in to fix or implement features which are too complicated to get right. This could be a firm or person with their own suite of AI tools. This will just be part of the job of established software companies.
- The legacy codebases of the future will be awesome. I can't tell you how, but I'm looking forward to the monoliths AI constructs.
- Most products will get worse, a few will become excellent. This is already true. It will just happen faster.
- The number of companies with a technical advantage will drop. When moats exist they will be legal, monopolistic, or social. Technical advantages will require hardware, specialized algorithms, specialized optimizations, or other niche techniques that are not easily found in training data.
- This also means individuals will write software to compete with existing software products.
- LLM API calls will remain affordable, or get cheaper. Thank you, open weights.
- LLMs will not fundamentally change. I don't think they will suddenly get smarter without expanding their training data into a given domain. They will get better, but they will get better at the things they are already decent at. They will not become sentient, or discover novel algorithms, or cure cancer. That will require some novel approach.
- LLMs will get better at writing software. Training data is easily obtained and there is money in replacing software engineers.
- But LLMs will have difficulty with race conditions and data consistency.
- LLMs will never be funny... intentionally.
- We will see fascinating security research.
- OpenAI will go out of business (or be acquired by Microsoft).
- Free/Open software projects will find ways to use these models in development. Some people will complain, but LLMs will prove useful for projects maintained by one or two contributors.
- Someone will invent an AI-first programming language; it will fail to gain momentum because no tooling companies will adopt it.
- Python and JavaScript will be around for at least 30 years, and will be the main targets of LLM projects. Then again, I hear Rust and Cursor work well together, so maybe it's just that currently popular languages will persist longer as a result of LLMs.
- Libraries and languages will be developed with LLMs in mind.
- Unless we do something, the Earth will heat up. We will experience continued biodiversity loss. Our descendants will live on a ghost planet.