This would be great if said comprehension were reliable. But I've seen tools designed to "understand" and document repos hallucinate many times, often producing a plausible but completely wrong explanation of how things actually work, or, even more subtly, of why they work the way they do.
And while I could catch that because I wrote the code in question and know the answers to those questions, others do not have that benefit. The notion that someone new to the codebase - especially a relatively inexperienced dev - would have AI "documentation" as a starting point is honestly quite terrifying, and I don't see how it could possibly end with anything other than garbage out.