The Renovation Problem

Back when I first became a team lead I assumed I’d tell folk what to do and they’d just do it. It’d be great - and easy. But it didn’t work out like that. I quickly discovered that people are, well, different. They have individual motivations and desires - ones that were different from mine.

It took me time to realise that successful leadership requires understanding motivations. Only then can you work out how to correctly incentivise the work - and get your team aligned with your goals. Understanding motivations is hard. People don’t offer up their motivations freely. Often they don’t even understand them properly themselves. It takes time. Talking. Getting to know people well.

One useful model when thinking about motivations is to split them into extrinsic and intrinsic. Extrinsic motivations are the things we do for external rewards - salary, promotion, avoiding getting fired. Intrinsic motivations are the things we do because we find them rewarding in themselves. Building code because you enjoy it, perhaps. Intrinsic motivations are much more powerful, but harder to create. They require autonomy, mastery, purpose. Many managers miss this; they think motivation is about finding the right carrot or stick. But it’s really about understanding each individual’s intrinsic motivations and tailoring incentives around those.

What does this have to do with AI?

Now consider how AI affects software engineering orgs. Let’s split a hypothetical org into three layers - ICs, middle managers and a leadership team.

By now many ICs have realised AI might take their job. It’ll almost certainly take some of their colleagues’ jobs. The bits of the work they actually enjoy - the problem-solving, debugging, technical mastery - those are exactly the bits AI is coming for first. And the reskilling required? It’s substantial and unclear. Not everyone will make it. The only clear motivator is fear: use it or lose your job. Not exactly autonomy, mastery and purpose.

Middle managers have an additional problem. Their worth is legible in headcount. “I manage fifty engineers” means something at a dinner party, on a CV. “I run a team of a hundred instances of Codex” doesn’t have quite the same social cachet (yet). Even if your new team is genuinely more productive, the respect from society hasn’t caught up. You’ve done the right thing but somehow end up looking less competent.

And leadership? They should be the obvious winners here. More productive teams, lower costs, faster delivery. But even CEOs measure themselves by the size of the thing they run. A company of 5,000 feels more significant than a company of 500 making the same money. There’s a reason “we’re hiring aggressively” is always the press release, never “we’ve figured out how to do more with less.” Growth is the story that raises money. That gets you on glossy magazine covers.

And so?

Getting legacy software development organisations to successfully adopt AI is shaping up to be extremely hard. Many leaders are falling back on extrinsic motivators - some as extreme as “use AI or lose your job.” There are few, if any, good intrinsic motivators on offer for existing engineers.

Over Christmas I was talking to a friend who’s a structural engineer. They were telling me how sometimes old buildings just aren’t fit for purpose anymore. However attached we might be to them, it’s too difficult to adapt them. Too expensive. Too time-consuming. The right thing is to tear them down and build anew.

My friend loved old buildings. But they’d learned to recognise when preservation became sentimentality. When people risked spending twice the money to get half the result, just because they couldn’t let go.

I think about that when I look at software organisations. Layers of process, decades of accumulated culture. Built for a world where humans were the only option. You can’t just swap in AI agents and expect it to work. The resistance isn’t irrational - it’s the building telling you it wasn’t designed for this.

How many software companies have declared “our employees are our biggest asset”? And they were right. All that distilled knowledge and technical ability. Writing code was hard and there weren’t many humans who could do it well. But now? Those staff may turn out to be the biggest liability. Their resistance to change, their inability to adapt. Managers unwilling to relinquish their empires. What happens when skills are suddenly devalued overnight?

Maybe the companies that win won’t be the ones that successfully transform. Maybe the winners will be the ones that start fresh, unencumbered. That’s uncomfortable if you’re inside a legacy org. And probably even more so if you’re running one.


Originally published on Martin Davidson’s Substack. Follow Martin for more on AI and software engineering.