I think you have the consequences of AI exactly backwards. AI provides virtual headcount and will vastly increase the ability of small teams to manage sprawling codebases. LLM context lengths are already on the order of millions of tokens. It takes a human days of work to come to grips with a codebase an LLM can grok in two seconds.
The cost of working with code is much lower with LLMs than with humans and it's falling by an order of magnitude every year.
So if you've got a data object that's defined in multiple places across a sprawling codebase and you want to change it, are you going to trust the LLM to find every one of those definitions and not miss a single one?
> Why is your data object defined in multiple places in your codebase?
Because that's the negation of my premise, which you disagreed with: "Keeping to the DRY principle is also more valuable in the age of AI when briefer codebases use up fewer LLM tokens."
> And why aren't you using your IDE to change them all at once?
It sounds like you're assuming they're all defined consistently enough that you could catch every one of them with a search.
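To sketch what I mean (a hypothetical TypeScript example, names made up): the same conceptual object can be declared twice with different identifiers and field names, so neither a text search nor an IDE rename touches both copies.

```typescript
// user-api.ts — the "user" object defined once as an interface
export interface UserRecord {
  id: string;
  email: string;
  createdAt: Date;
}

// billing/invoice.ts — the same conceptual object re-declared inline,
// with different names and field order, so searching for "UserRecord"
// (or renaming it in the IDE) never touches this copy
type Customer = {
  customerId: string;   // same data as UserRecord.id
  contactEmail: string; // same data as UserRecord.email
  signupDate: Date;     // same data as UserRecord.createdAt
};
```

Rename UserRecord or grep for it and Customer stays exactly as it was, and that's the copy that bites you later.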