How do you envision the role of AI in software development evolving in the future?

Last Updated: 24.06.2025 09:52

We’re still waiting to see how the dust is going to settle IMO.

In the past 3 years there have been 3 pivotal moments:

the introduction of code completion tools (GitHub Copilot etc.), which liberate devs from memorizing precise syntax,

conversational LLM agents (ChatGPT, Claude etc.) that can accelerate research, simulate brainstorming and perform small technical tasks,

agent-centric IDEs (Cursor, Windsurf, Claude Code…) which empower agents to reason over an entire codebase and provide more actionable answers / perform more useful tasks.

We are entering a new phase of uncertainty. In the late 2010s/early 2020s (the “pre-copilot era”) the developer experience was consolidating around fewer tools with large adoption. Now the market for these tools is fragmented again.

The trends I expect to continue are:

A larger part of the code in codebases is going to be generated. This doesn’t mean that a large portion of the tasks once handled by humans can be entirely delegated to AI; rather, in a typical commit, an increasingly large proportion of the changed lines of code will be produced automatically.

Developers will spend less time typing code and more time thinking about code: describing their projects and discussing what they want to achieve with an agent, which requires reasoning about and formalizing what they want to accomplish.

Developers will spend more time on quality assurance, both upstream and downstream: how should this piece of code integrate into the larger whole, what are the signals that it’s broken, and what logs, testing, monitoring and alerting should be put in place.

In the “pre-copilot era” there was a general push towards code quality, as in: developers were nudged into writing code that was easier for their fellow developers to maintain. Code quality is going to evolve into code that AI agents find easy to work with. Those two things are not incompatible, but it means things like more comments and more tests.
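
To make “more comments, more tests” concrete, here is a minimal sketch (my illustration, not from the original answer) of what agent-friendly code can look like: a small function with type hints and a docstring stating its intent, plus a test that pins down the expected behaviour so an agent can verify its own changes. The function and its behaviour are hypothetical.

```python
# Hypothetical example: code written so that both colleagues and AI agents
# can infer intent and safely modify it: explicit types, a docstring that
# states the contract, and a test that verifies it.

def normalize_scores(scores: list[float]) -> list[float]:
    """Scale non-negative scores so they sum to 1.0.

    Returns an empty list for empty input; raises ValueError if any score
    is negative or if all scores are zero.
    """
    if not scores:
        return []
    if any(s < 0 for s in scores):
        raise ValueError("scores must be non-negative")
    total = sum(scores)
    if total == 0:
        raise ValueError("at least one score must be positive")
    return [s / total for s in scores]


def test_normalize_scores() -> None:
    # The docstring plus this test give an agent an unambiguous target
    # to work against when refactoring or extending the function.
    assert normalize_scores([1.0, 1.0, 2.0]) == [0.25, 0.25, 0.5]
    assert normalize_scores([]) == []
```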

I think that “vibe coding”, i.e. giving a brief description of what you want to achieve and getting fully functional code as a result, is going to have very limited impact. It works, yes, but only in very specific cases; it doesn’t scale well, and in the general case the economies it creates are not worth the trouble.