That was why it was done that way in languages developed in the 1970s. But, as kerkeslager points out, that logic doesn't work for an enterprise programming language from the 2000s.
We don't actually need to speculate on this. Don Syme has explicitly said that this was a deliberate language design decision meant to discourage the big ball of mud antipattern. And the language maintainers continue to cite this as the reason why they don't change this behavior even though they easily could.
Single-pass compilation can support backreferences (i.e., referencing a symbol and then defining it later) efficiently with a technique called "backpatching". Every single-pass compiler I know of uses backpatching to compute jumps--I'm not even aware of another way to compute forward jumps. For symbols, backpatching is a bit more involved to implement, but it's a well-known solution and I don't think it would be a significant barrier for any competent compiler developer. That is to say, if they've chosen not to support backreferences, it's not because it's hard to do in a single-pass compiler.
EDIT: The wonderful book Crafting Interpreters has an implementation of backpatching jumps to implement loops. Before anyone says "this is an interpreter, not a compiler", be aware that most modern interpreters contain a compiler. https://craftinginterpreters.com/jumping-back-and-forth.html
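To make the idea concrete, here's a minimal sketch of backpatching in a toy single-pass emitter. The bytecode, class, and method names are all made up for illustration--this is not how F#'s compiler (or any real one) is written, just the shape of the technique: when a symbol is referenced before it's defined, emit a placeholder and record the hole; when the definition arrives, patch every recorded hole.

```python
# Toy single-pass emitter with backpatching (illustrative names only).

class Emitter:
    def __init__(self):
        self.code = []    # flat list of opcodes/operands
        self.labels = {}  # resolved symbol -> address
        self.pending = {} # unresolved symbol -> list of hole addresses

    def emit(self, *ops):
        self.code.extend(ops)

    def ref(self, name):
        """Emit an operand referring to `name`, which may not exist yet."""
        if name in self.labels:
            self.emit(self.labels[name])
        else:
            # Forward reference: emit a placeholder, remember the hole.
            self.pending.setdefault(name, []).append(len(self.code))
            self.emit(-1)

    def define(self, name):
        """Define `name` at the current address; patch earlier refs."""
        addr = len(self.code)
        self.labels[name] = addr
        for hole in self.pending.pop(name, []):
            self.code[hole] = addr  # the backpatch itself

# One pass over source that calls `helper` before defining it:
e = Emitter()
e.emit("CALL"); e.ref("helper")  # forward reference -> placeholder -1
e.emit("HALT")
e.define("helper")               # placeholder patched to address 3
e.emit("RET")
# e.code is now ["CALL", 3, "HALT", "RET"]
```

The same mechanism handles forward jumps (the Crafting Interpreters chapter linked above does exactly this for loops and conditionals): the "symbol" is just an anonymous jump target, so you skip the name table and patch a single hole.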
Sure, we could use some craft to work around the single-pass limitations, but that was Don Syme's decision from long ago, and the F# folks have remained faithful to it.
I actually liked the language a lot; I thought it could replace Python as the top AI/ML tool.