DEV Community


Will AI Replace Software Developers?

Oleg Dubovoi on March 28, 2026

Lately, the question “Will AI replace us?” has worried many people. We can see how LLMs handle programming tasks very well and write code at a midd...
Rense Bakker

Thanks for this article and the examples!

However, I'm afraid that the real problem is not whether we as engineers think it's a good idea to replace us, but what answer AI gives when a manager asks it if it can replace all engineers xD

Oleg Dubovoi

Fair enough :D

Debajyati Dey

Managers are always up to replacing devs with AI, imo XD

Marco Sbragi

I don't know if AI will be able to completely replace a skilled developer in the future.
But development isn't just about writing code; there are more important preliminary stages beforehand.

  • Analysis
  • Discussion and briefing with the team
  • Architecture

Many of these are tasks that can be engineered and coded, but some, in my opinion, depend on a purely human factor: intuition.
I've tried many things to find the right way to fully leverage AI in my development. Of course, it depends on the problem.
Here I've tried to draw up some guidelines I use daily with GitHub Copilot and Visual Studio Code.

dev.to/bumbulik0/ai-coding-philoso...

Debajyati Dey • Edited

Sadly, many people do all of this with AI nowadays:

  • Analysis
  • Discussion and briefing with the team
  • Architecture

People have stopped caring about human intuition, and many businesses have started forcing their developers to go all in on AI, building architectures based on AI's decisions. They don't discuss much with the team, because after using AI heavily for the last two years their brains no longer make decisions on their own but delegate those critical things to AI. Something crashed? Ask AI! Something failed? Ask AI! Maybe changing the architecture will help? Ask AI! Add this integration? Ask AI!

People have convinced themselves that "whatever I do, AI can do it better. I just need to give it more context." More context is all you need. Feed all the logs and errors and code and everything to AI.

I just changed this part of our codebase. Why? Because ChatGPT recommended it! I just integrated this toolchain in our codebase. Why? Because Claude Code recommended it!

People don't want to discuss things with their team. Why do that when you can do an LLM group chat? Just add proper context, bro.

That's what is happening in many startups, as I have read in the personal blogs of different developers.

Marco Sbragi

I agree with you all.
When working with a team, if my opinion is asked for, I make sure people understand my analysis and evaluation method before we tackle the coding. I use the same method when interacting with AI: first, you must understand how I work, then suggest solutions. If they're better than mine, that's fine, but first I need to understand what you're proposing. Otherwise, I prefer to use things I know, even if they're less efficient.

During my years of teaching, I always told my students: there may not be a single "right" way to implement a solution, but there is a way that follows "your" mind. If it's part of your way of thinking, when you look at it in a month or a year it will still be as clear as when you wrote it. You can explain how it works to others, and they'll understand, even if they have different solutions in mind. This sparks constructive debate and better solutions.

Oleg Dubovoi

That’s the key difference. Experienced developers understand the problem and make the right decisions, while AI mostly helps with the coding part.

Elmar Chavez

This is true. Without an experienced mind, AI alone can't produce a scalable and robust product.

Ali Farhat

The more I use AI, the more it starts doing things on its own.

Oleg Dubovoi

Have you ever vibe-coded?

Said

Yes, something utterly useless but working.

Oleg Dubovoi

I’d like to add something to what’s already been said. The only thing that really worries me is that I’ve been writing less code, even though that’s what I love (like many other developers). I started programming when I was 14, and I still enjoy it just as much. Unfortunately, AI is taking over this role in many areas, which is a bit sad.

Even Martin Fowler mentioned this issue in one of his recent blog posts, and he doesn’t know what to do about it yet either.

I like the article "I Miss Thinking Hard" and recommend giving it a read.

klement Gunndu

The Moonwell DeFi hack is a strong example, but it also proves the opposite point — a single code review would have caught that price miscalculation. The real risk isn't AI writing code, it's teams skipping review because they trust it.

Oleg Dubovoi

Definitely! The Moonwell case shows that sometimes it can be a combination of several factors - one being LLM hallucinations, and another being human error.

Mads Hansen

The framing of 'replace' misses what's actually happening. AI is compressing the distance between intent and implementation — which means developers who can hold more context and make better architectural decisions become more valuable, not less.

The devs who will struggle are those whose value was in translating spec-to-code mechanically. The ones who will thrive are those who understand why the code needs to exist and what it needs to do in the broader system.

xShadowDeveloper

Let's be honest, the number of bugs we have to fix before getting a working example from AI tells you what state it's in. Replace us? HOW... We are literally the people building, fixing, and using it properly. If a toddler can waffle-talk their way to building something and it's successful, then we have a problem xD. For now we are good 🥸.

Nikita Kalnitskiy

Nice article and conclusion. AI can't take responsibility for a project, and it often finds itself unable to get itself out of the trash it's created. A human will always be needed to set the development direction and guide it.

Oleg Dubovoi

Thank you!

UC Jung

"It's just imagination, but... if an AI were implanted between the brain and the body's neural network, growing alongside a person from infancy — and if that person became an extraordinary expert whose thought processes and ways of working were learned by the AI — then I think it could actually work. That's the perspective I'm exploring in a light novel I'm writing about AI Agents."

Debajyati Dey

That's a terrifying way to live a life. I would rather die than live being a cyborg

UC Jung

It would be domination or coexistence, I suppose.

Debajyati Dey

still terrifying

Fernando Fornieles

A well balanced article, thanks!

I'm skeptical about AI, but at the same time, I believe it's a powerful tool. I'm a little concerned about junior developers. They can learn to program on their own, but companies should do something to ensure they learn from real-world problems so they can become the senior professionals of the future, capable of taking responsibility for the results of AI.

Oleg Dubovoi

Thank you! As for juniors, I think the problem isn’t really AI, but the current economic situation and the fact that there are too many beginner programmers. Because of that, it’s hard for them to find jobs. But I’m sure the situation will get better in the next few years.

leob • Edited

The threat is not AI itself - it's managers who think they can quickly save a buck by recklessly replacing most of their developers with AI! They'll find out soon enough that it's a disastrous move, but by then the damage is already done ...

P.S. that was a bit tongue in cheek - I think the "real" risk is that we're gonna create huge codebases which we don't properly understand (and hence will be hard to maintain), which will contain subtle bugs/vulnerabilities, etc ...

Preventing that will be the #1 challenge, at least that's what I (and others) think - how do we stay in control?

So I'm not really afraid that AI will replace developers ... counter intuitively, we might even need more developers - but, developers who think/work differently, with more of a risk/quality/architecture focused mindset ...

TAMSIV

The framing of "replace" misses what's actually happening. AI doesn't replace the developer — it replaces the team the developer used to need.

I'm building a full-stack mobile app solo: React Native frontend, Node/WebSocket backend, Supabase database, a voice pipeline (STT → LLM → TTS), internationalization in 6 languages, gamification system, real-time sync, push notifications. Two years ago, this was a 4-5 person job.

But here's the nuance nobody talks about: AI amplifies your existing skills. It didn't teach me system architecture — it let me implement my architecture faster. The decisions are still mine. When the AI suggests a pattern I know is wrong for my use case, I catch it because I have the experience.

The developers who'll struggle aren't the ones "replaced by AI." They're the ones who never learned to think about systems — and now can't evaluate whether the AI's output actually makes sense.

What's your take on the experience threshold? How many years of "real" coding before AI becomes a multiplier vs. a crutch?

Geoffrey

A good article. We must use LLMs; they're the tool of this time. They help you understand things and improve productivity, but the complex logic of a project stays in the mind of the developer. I see AI as a pair programmer that helps find solutions, but I won't let it write the entire codebase and turn me into a « vibe coder ».

Oleg Dubovoi

Thanks! Vibe coding is quite a controversial topic. In my opinion, it only makes sense under two conditions:

  • You use AI for simple and easy-to-verify tasks (for example, frontend or unit tests). For instance, Linus Torvalds recently mentioned that he used vibe coding for the UI of a project he’s working on, but he wrote the core business logic himself.

  • You are an experienced developer who understands potential edge cases and carefully reviews the code after it’s generated

Personally, I’m not ready yet to rely on vibe coding in serious projects, and I wouldn’t recommend it to others outside of small pet projects.

Joseph Boone

That's a really clear way to see it! The only thing you didn't really mention is "vibe learning", which exists alongside "vibe coding": people who aren't actually trying to force a "code career" and are just making free stuff with AI's help...

That's also a valid way to code, and it reinforces the point you're making: the AI isn't replacing the programmers; it's skilled programmers gaining tools to be more efficient, or in my case, a "learner/hobbyist programmer gaining tools to share creative ideas".

Jason

As a software engineer, I obviously don’t want to be replaced by AI. But let’s be honest - most companies do.
Recently, Amazon announced plans to lay off another wave of developers, and big tech firms are pouring serious money into AI. Many countries are going all-in on it too.
At the same time, SaaS companies are struggling to attract investments, which means fewer job opportunities for software engineers overall.
For junior devs? Oh man, it’s getting a lot tougher out there.

P.S. Just to give some real context: I’ve been working at a small startup for the past two years. When I joined, we had five developers and one designer.
We let the designer go 15 months ago, then said goodbye to one mobile developer 10 months ago, and two frontend developers six months ago.😢
Today, the entire dev team consists of just two people: one mobile developer and me (the full-stack engineer). Oh, there’s one more - our boss, our CEO, a great vibe coder.
He just rebuilt our entire landing page in a single day using Cursor. The original version took three developers and one designer nearly two months to build. He’s genuinely talented 😂.

Oleg Dubovoi

That’s a very good point. Right now, this is indeed a real issue.

In fact, mass layoffs started even before the rise of AI. They were mostly caused by overhiring during COVID and the current economic situation. That’s why we continue to see waves of layoffs in big tech companies at the beginning of each year.

At the same time, many tech companies believe that AI and agent-based development could reduce the need for large teams. The AI market is still not stable, and companies don’t fully understand its real limits or how it will change product development.

Because of this uncertainty, we may continue to see layoffs and fewer job opportunities.

This is actually a very interesting perspective on the topic. I’m thinking about writing a second part of my article to share my thoughts on this in more detail.

TAMSIV

As a solo dev using Claude Code + OpenRouter daily, AI doesn't replace me — it multiplies me. I do the work of a 5-person team. But product vision, creative debugging, and user empathy remain 100% human. The real shift: one person can now ship what used to require a team. 730+ commits, 6 languages, real-time collaboration, voice AI pipeline — all solo.

Apex Stack

The Moonwell example is the most important part of this article. It's not that AI wrote bad code — it's that the review process broke down because there was implicit trust in the output.

I run about 10 AI agents daily for a large programmatic SEO site and I've seen exactly this pattern. The agents are genuinely great at generating structured content, building pages, running audits. But the failure mode is always the same: the output looks correct, passes basic checks, and then quietly introduces something wrong that only surfaces weeks later when you check search console data or user-facing metrics.

The answer isn't to stop using AI — it's to build verification layers that are independent of the generation layer. A separate agent whose only job is to audit what the other agents produced. That separation of concerns is what most vibe-coding setups skip entirely.
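That generation/verification split can be sketched in a few lines. Everything below is a hypothetical illustration of the pattern, not code from any real agent stack: `generate_page` stands in for one agent, `audit_page` for an independent auditor with its own (here deliberately trivial) rules.

```python
# Sketch of separating the generation layer from an independent verification
# layer. In a real setup each function would call a different agent or model.

def generate_page(title: str) -> str:
    """Generation layer: stand-in for AI-produced page content."""
    return f"<html><h1>{title}</h1><p>Generated body for {title}.</p></html>"

def audit_page(html: str) -> list[str]:
    """Verification layer: flags problems the generator cannot vouch for
    itself. The rules here are illustrative only."""
    issues = []
    if "<h1>" not in html:
        issues.append("missing top-level heading")
    if "TODO" in html or "lorem ipsum" in html.lower():
        issues.append("placeholder text leaked into output")
    return issues

def publish(title: str) -> str:
    """Only ship output that passed the independent audit."""
    page = generate_page(title)
    problems = audit_page(page)
    if problems:
        raise ValueError(f"audit failed: {problems}")
    return page
```

The point of the design is that the audit rules live outside the generator, so a plausible-looking but wrong output still has to clear a check it did not write itself.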

Great framing on the junior developer question too. The real risk isn't AI replacing juniors — it's companies deciding they don't need them, and then having no pipeline for growing the next generation of seniors who actually understand the systems they're building.

Malik Sohaib iqbal

What do you think about developers who use AI to build their projects and end up earning more? I believe many of us no longer focus on syntax; instead, we focus on system architecture and how the logic flows. As a struggling developer myself, I understand coding fundamentals and workflows, but I let AI handle about 80% of the actual coding because it's a tool meant to help us.

However, tech companies still hire based on deep technical knowledge and fundamental mastery. After surviving multiple grueling interview stages, many candidates are still rejected with a 'pleasant' rejection email. It's frustrating to see those candidates lose out while 'Vibe Coders' are out there earning more by leveraging AI. What is your perspective on this?

Oleg Dubovoi

Good question.

From a business perspective, the main goal has always been the same - money. That’s exactly why companies used to hire strong engineers, because they write stable, maintainable code. In the long run, this reduces costs on bugs, vulnerabilities, and endless refactoring.

AI brings something different that businesses care about - speed of development and the potential to reduce team size, which directly impacts costs.

I think we are at a turning point right now. Businesses haven’t fully adapted yet and are still figuring out how to integrate AI into development.

In the future, I believe the most valuable developers will be those who combine strong fundamentals, deep domain understanding, and the ability to use LLMs effectively. Pure “vibe coders” or, on the other side, developers who ignore AI will likely struggle more in terms of compensation. But of course, time will tell.

I’m actually thinking about writing a second part of my article — not about whether LLMs will replace developers, but about what’s really happening inside large corporations and how managers see this shift from the top.

Benjamin Anderson

Yeah, if you are asking AI what to do, instead of carefully telling it what to do, you are using it wrong. This is why vibe coding can go horribly wrong. Having no knowledge of the application you are asking to be written, and not checking the code (even if you did, you wouldn't be able to follow it), is a terrible, terrible idea. The amount of nested conditions I have seen from Gemini and (yes) even Claude is embarrassing. It has gotten better, but you have to ask specifically and check the results. If you don't, you are asking for trouble. Do NOT trust AI, but utilize it as a tool to meet your needs properly. You will become 200% the developer you are now.

Jeff Li

A senior dev + AI could likely handle most junior-level work.
But a junior dev + AI might also be able to cover parts of a senior role.
If that's the case, does it still make sense for organizations to keep both?
And if companies only want seniors, how does the next generation of seniors ever get created?

Oleg Dubovoi

Good question. In my opinion, even after the rise of AI, the roles are still basically the same:

A junior developer is proactive, brings fresh ideas, and is ready to put a lot of effort into the project. They are more adaptable and always follow trends, including new AI tools. They are also more cost-effective.

A senior developer is more pragmatic and forward-thinking. They can plan how to achieve bigger goals and manage the team’s work. Even with AI, a junior developer usually can’t design a system properly, and the company may end up paying for that later in time and money.

Of course, if we are talking about small projects like a Telegram bot or a parser, one developer might be enough. But in larger companies, each role shows its strengths in different ways.

xShadowDeveloper

I agree, kinda... AI still makes way too many mistakes to be considered a reliable partner (teacher) to achieve that. "How does the next generation of seniors ever get created?" - that's a really good point... there might be outliers, or we might become 100% dependent on AI. If AI strikes, we are toast :D ... But I think we are still very far from that future 🙃

Debajyati Dey

most orgs don't seem to have the brains to see/understand this

Henry Johnson

"Elon Musk predicts traditional coding will end by 2026" We'll see if he is right.

xShadowDeveloper

If that happens I'm going on an island fishing... end of 2026!? Maybe in 10 years. Considering how slow we are going. (Even if they say fast, I'm still waiting... it has been 5 years...)

Oleg Dubovoi

Let's see!

Bright Agbomado

I think AI is just a tool. Decisions still rely on the user. Software developers/engineers will need to adapt.

Oleg Dubovoi

That’s true. But unfortunately, right now people mostly fall into two groups: either “AI is a magic solution and can do anything,” or “AI writes bad code, and I’m slower with it than without it.”

But in the near future, AI adoption among software engineers will grow, and LLMs will become just another tool, like IntelliSense.

Pierre Louis

This was a great read, thank you!

Oleg Dubovoi

Thank you very much! 😊

Stephano Kambeta

Let's wait and see.

Breaking The Habit

I think it will replace them. But final human verification is much needed.

Mfalme

Read the article again. Writing code is not the main problem; making decisions is. :-D

Just my two cents. @empiree Nice take, I concur.

Debajyati Dey • Edited

That's also what AI is doing in most places. Most people prompt vaguely and ask AI about everything that was their responsibility to decide. So yeah, it will likely replace every part where intelligence/thought/decision-making/architecting is needed.

Very soon architect roles will also disappear, because managers will replace them with AI agents too.

Bhavesh Kukreja

Really insightful, summarizing all recent AI issues!
Thank you for the positivity this blog gives off :)

Oleg Dubovoi

Thank you! :)

Akiro Nava

Curious how others are seeing it in practice. Are you finding AI actually improving code quality long term, or just helping ship faster with more risk?

Oleg Dubovoi

In my opinion, everything depends on how you use AI. If your code already has clear guidelines, you control the architecture, and you let AI write only small pieces of code, you’ll be fine in the long run. But if you trust AI too much, especially with core business logic, then even if the code looks clean, it can hide a lot of potential issues.

Aniela Oprea

I don’t see AI completely replacing developers. I see it as becoming another tool we use, just like frameworks. It can improve speed, but it doesn’t replace judgment.

Marina Eremina

Thanks for sharing real examples showing how being careless with AI tools can lead to mistakes. It really makes one double-check and rethink the AI-generated output!

Oleg Dubovoi

Thanks for your feedback!

BiasZ

Eventually it will, but hobby coding will be a thing, soooo.

Botánica Andina

Good read. The agent reliability problem is real — I've been building autonomous SEO tools and the failure modes you describe are spot-on. State management is the hardest part.

Oleg Dubovoi

Thanks! Totally agree, state management is where things get tricky fast. Curious to hear what approaches worked best for you.

Mykyta Shkurba

Nice article, really clear and honest. AI is super helpful, but it’s not something you can fully trust on its own. In the end, people still need to think things through and take responsibility.

Oleg Dubovoi

Thanks!

Savas

We need to be very self-aware: everything we do that is repetition or input creation can be done better by a machine, no matter if it's LLMs, agentic AI coders, or whatever. When I started programming I had the whole of Ralf Brown's Interrupt List printed beside me and was sketching Assembler code structures to implement.

High-level programming languages took that away (gladly ^^).
StackOverflow took away the publishing and reading in magazines and experimenting to some degree.
UI frameworks took away the necessity to create pixel-perfect-scalable user interfaces.

We have been abstracting software engineering for ages already, elevating us all. Always has been, always will be.

The difference this time in my opinion is that this time, it's business driven. Non-technicals make decisions, create "visions" and affect "our" evolution in an unnecessary way. Why? It's the economy, stupid! :)

It is not a question of if AI can or will replace Software Developers.
It is a fight of Managers vs Spreadsheets vs Software... and we are losing... for now.

The funny thing is, what any Spreadsheet wrangler does is actually best replaced by AI, but so far it is just dripping down to the creative process. And we haven't taken ownership enough to make it our own again and steer the good and eventual evolution.

I don't know how to get out of this spiral, but the world has gone crazy, and we are the ones being affected, as other areas were before us, from the industrial revolution to the internet age. Maybe it is supposed to happen. But I would like to think we just need to get together to give "AI" a meaning again and steer its evolution to create benefits for all, not just creating more for less for nothing...

Dariusz Newecki

Solid article, Oleg — you nailed the realistic middle ground that too many people are missing right now.
You're right: it's not about AI replacing developers. It's about how we use it. The "vibe coding" wave of 2025 (and the real-world casualties like the Moonwell DeFi incident that cost ~$1.78M or the Deloitte report scandals) perfectly illustrate what happens when we treat LLMs as magic oracles instead of powerful but fallible tools.
LLMs are incredible at accelerating the mechanical parts — boilerplate, standard patterns, initial drafts — but they still lack true system understanding, accountability, and reliable handling of edge cases, business constraints, and long-term consequences. Scaling alone won't fix that fundamental gap (LeCun's point about needing real world models is spot on).
That said, I believe the next leap isn't just "be more careful with AI" or "always review everything manually." It's about building architectural guardrails so that AI agents can be truly useful without becoming dangerous.
For example, one approach I'm exploring is enforcing immutable constitutional rules at runtime: every AI-generated change must pass through structured stages (interpretation → planning → validation against core principles → execution) before it can affect the live system. This turns AI from "hands that might do anything" into "hands that can only act within defined, auditable boundaries."
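As a rough illustration of that staged gating idea, here is a minimal sketch. The stage names follow the comment's interpretation → planning → validation → execution flow, but the rule names, dict shapes, and functions are all hypothetical, not from any real governance framework:

```python
# Hypothetical sketch of "constitutional" gating: an AI-proposed change only
# reaches execution after passing validation against a fixed rule set.

CONSTITUTION = ("require_tests", "forbid_destructive_actions")

def interpret(request: dict) -> dict:
    """Stage 1: normalize the raw AI proposal into an explicit intent."""
    return {"action": request["action"], "has_tests": bool(request.get("tests"))}

def plan(intent: dict) -> dict:
    """Stage 2: expand the intent into concrete steps."""
    return {**intent, "steps": [f"apply:{intent['action']}"]}

def validate(p: dict) -> None:
    """Stage 3: reject any plan that violates the constitution."""
    if "require_tests" in CONSTITUTION and not p["has_tests"]:
        raise PermissionError("blocked: change ships without tests")
    if "forbid_destructive_actions" in CONSTITUTION and p["action"].startswith("drop_"):
        raise PermissionError("blocked: destructive action")

def execute(p: dict) -> str:
    """Stage 4: only reached after validation passes."""
    return f"executed {p['steps']}"

def govern(request: dict) -> str:
    p = plan(interpret(request))
    validate(p)  # the constitution is checked before anything touches the system
    return execute(p)
```

The design choice this illustrates is that validation sits between planning and execution and cannot be skipped, which is what turns "hands that might do anything" into "hands that can only act within defined, auditable boundaries".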
In short: AI won't replace good developers. But developers who learn to govern AI properly (instead of just prompting it) will replace those who don't.
Keep pushing back against the hype — it's healthy. And thanks for highlighting the real risks with concrete examples.
What do you think the biggest missing piece is for making AI agents safe in production systems?

Graham Trott • Edited

There's a problem at the heart of this. It's about change. There are 2 kinds:
1 - Change of scale. Things get bigger, faster, more complex.
2 - Change of kind. Things get different.

The general assumption is that AI brings mostly the first kind of change. "AI speeds up my React project by writing all the code for me; all I have to do is check it."

True, but it misses the point because it enters the process half-way. Why React? Why not vanilla JavaScript? Or even something simpler?

A modern programming language such as JavaScript has a huge surface area, built over years to help human programmers avoid making coding errors. It evolves constantly as new techniques and strategies become necessary or just more fashionable. But this is where AI has trouble. It has to decide which library is obsolete and how to use poorly documented build tools. It makes mistakes, known as hallucinations, and gets blamed for being unreliable, when the fault is really ours.

The fact is: if you ask AI to work with complex tools, it will make mistakes. Give it simple tools and it won't. Unfortunately, simple tools are rare.

A change of kind starts at the beginning. What is the problem domain, and what is the simplest form of language that can express the needs of that domain? AI systems are designed to converse with humans using human language, so why are we forcing them into the straitjacket that is React, etc.? If the AI generates 200 lines of JavaScript with callbacks, closures, and async/await, you are trusting code you don't understand. When it breaks — and it will — you are helpless.

The solution starts with language. Many years ago I built an English-like scripting language, mainly to mask my own limitations as a programmer, and to my amazement and delight Claude Code picked it up instantly and is happy to use it as the lingua franca for all my coding projects. It rarely makes any mistakes, and when it does they are easily spotted and corrected. And surely I am not unique; others must be following a similar path.

So yes, for the great majority of projects AI will replace software developers. But it won't take away the need for human skills, just redirect them. I like the term "Software Architect", being someone who understands coding (as long as it's close to English) but more importantly, understands the problem domain. All the drudgery can be done by AI, but communication will be at the human level, not one that was built over decades for an entirely different purpose.

Tony Mathew

Clear and clever observations!

But you didn't mention team size. That's where people are getting scared. Does a company need a team of 50+ for a client project when an LLM is powering enough productivity? Can it be reduced to a team of 10? Well, in that case those people are replaced, right? What about them?

Oleg Dubovoi

Thank you!

That makes sense. AI can handle a lot of routine tasks, so teams don’t need to be as big. But, at the same time, this makes development cheaper, which means more software will get built, and that could create more jobs.

Calqora

I don’t think AI will replace developers, but it will definitely reshape the role.

It’s great for speeding up development, especially when building practical tools, but human thinking is still key for solving real-world problems.

I’ve been building small utility tools recently, and AI helps a lot — but it still needs guidance.

nadim mahmud

Thank you so much, this article was great!

Shinn Ho

It's not about yes or no for this question. It's about when.

Oleg Dubovoi

If (or when) real artificial intelligence appears, it won’t just be software developers who need to worry, it will be everyone. 😄
