Which is why we started saying "whoa, slow down" when it came to certain artifacts, such as nuclear weapons, so as to avoid the 'worse than we can imagine' scenario.
Of course, this is much more difficult when it comes to software, and very few serious people think an ever-present government monitoring your software would be a better option than reckless AI development.
The best we can ever hope to do is find mitigations as and when problems arise.