This discussion came up recently, but my stance is still that if you know for sure at compile time that some code always results in UB or is simply non-conforming, you should not fuck around and remove a big chunk of code just because the standard allows it. Instead, either refuse to compile (my preference) or at least emit code representing what was written (effectively the implementation-defined behavior route).
Sure, and compilers do now sometimes emit warnings like "shift amount is always out of range".
Overall, though, this isn't how compilers work: they aren't "fucking around" with code they know for sure is broken and will be executed. They are just applying a long series of optimization passes, including passes that remove unreachable code, and those interact with an analysis of paths that cannot be taken for various reasons (including paths that end in UB) to remove "big chunks of code".
The same passes that screw you over in these famous examples are the ones that help code generation in the 99% of remaining cases, including many "obvious" optimizations you'd want the compiler to perform.
I know the situation with undefined behavior is distressing and the examples make it look like the compiler writers are out to get you, but that's not really the case.