29 March 2026
Scar Tissue — What AI-Assisted Development Can't Yet Replace
A former colleague's comment stopped me in my tracks. On operational experience, hard-won instincts, and what a nine-phase process can and can't encode.
A former colleague left a comment on my last post that deserves more than a reply. He has a deep ops and infra background — user support to sys admin to infrastructure to architecture — and is now Head of IT at a national regulator in Ireland. His comment was measured, thoughtful, and landed harder than most things I’ve read about AI and the future of technology work:
“I think currently we benefit from people who have cross discipline experience. Take away the emotional and physical scars of that frontline service and I’m not so sure where we end up. It’s a brave new world.”
I’ve been sitting with that comment since. Because he’s right. And I say that as someone who started exactly where he describes.
Where I started
Before I became an enterprise architect, I was on the frontline. I carried a laptop in the car at weekends. I got pings in the night that woke me up and didn’t let me back to sleep. I spent hours trying to get systems back up while users were waiting and managers were asking questions I couldn’t yet answer.
That experience gave me something I still use every day. Not just technical knowledge of how systems fail — though that matters — but a value system. The understanding, forged under real pressure, that operability and stability are must-haves. Not afterthoughts. Not nice-to-haves. Non-negotiables that get designed in from the start because you know what it costs when they aren't.
You don’t learn that from a textbook or a process document. You learn it at 2am in a car park when something is broken and it’s your problem to fix and the only person who can help you is yourself.
The honest question
So when I think about AI-assisted development and the next generation of builders — people who may never carry that laptop, never get those pings — I have to ask an honest question: does my nine-phase process solve this?
The process forces decisions that inexperienced builders typically skip. It includes a SECURITY.md, explicit architecture decisions, non-functional requirements, and operability and stability considerations built in from the start. In that sense it encodes hard-won wisdom and makes it available to anyone willing to follow it rigorously. It raises the floor significantly.
But here’s where I have to be honest: it doesn’t replace the ceiling that operational experience builds.
The difference between what and why
A checklist can tell you what to include. It can’t tell you why it matters in your bones.
Someone following the nine-phase process without operational experience might tick the stability and operability boxes. They might include the right documents and ask the right questions. But will they feel the weight of those decisions the way someone who has been personally accountable for a real-world system failure does? Who has watched the consequences ripple out — the users affected, the trust damaged, the weekend lost?
There’s also a subtler problem. Operational wisdom doesn’t only live in what you write down. It lives in the instinctive “something feels off about this” that comes from having seen similar patterns fail before. The gut check that fires before you can articulate why. That’s harder to encode in a template — possibly impossible.
The nine-phase process, in my hands, is effective because I know what I’m encoding and why. Every document, every constraint, every guardrail has a reason behind it that I can trace back to something I’ve seen go wrong. In the hands of someone without that background it’s better than nothing — possibly much better — but it isn’t the same thing.
What following best practice actually requires
There’s another dimension to this worth naming. Following availability patterns and best practices in AI-assisted development will get you far — but only if you know those patterns exist and are explicit about applying them when designing and building. That knowledge and that discipline don’t emerge automatically from using an AI tool. They have to be brought to the process.
This is sometimes lost when building with an agent. The AI will do what you ask. If you don’t ask for operability, you won’t get it prioritised. If you don’t specify stability requirements, they won’t appear. The agent doesn’t carry scar tissue. It doesn’t know what 2am feels like. It optimises for what you define — which means the quality of what you define is everything.
The open question
I don’t have a clean answer to what my former colleague is really asking. Is there a new path to building those instincts in an AI-assisted world? Can you develop the right intuition without the traditional frontline experience that forged it in previous generations?
Perhaps the path looks different but leads somewhere similar — more deliberate, more structured, less accidental. Perhaps the nine-phase process and the discipline of building with explicit constraints is a new kind of apprenticeship. Or perhaps something genuinely gets lost, and we’ll only know what it is when we miss it.
What I’m confident about is this: the people who will navigate this well are the ones who bring genuine operational humility to the work. Who treat the process as a serious discipline rather than a box-ticking exercise. Who ask not just “does this work?” but “what happens when this fails?”
That instinct — wherever it comes from — is still the most important thing a builder can have.
Brave new world indeed.