Tell that to the new programmer who builds a piece of software using this and creates an absolute mess.
The danger here isn't with experienced developers (this is, obviously, a tool with great potential for productivity). It's with people who just blindly trust what the robot spits out.
Once code like that is deployed in a mission-critical system without discernment, all hell will break loose.
Tell that to the HR departments responsible for hiring developers at major companies.
Hiring subpar developers, especially at a massive company, isn't a matter of "if" but "when." And it only takes one software screw-up to crash an airplane.
And guess who massive companies trust for their technology?
What is the hard boundary between you and your AI companion?
What happens when people think their "self-driving" cars are more capable than they actually are? Many times you're better off in a dumb car, because your expectations are set correctly: you know you have to pay attention to a continuous stream of events and respond in a stateful manner.
If you try to bootstrap a holistic understanding of your problem in between bursts of auto-generated code, I don't think you are going to have a fantastic time of it.