Because 100% of implementations are differently shitty. No amount of "pushing browsers to fix things" will catch 100% of the novel interactions that arise from the different combinations of the declarative HTML and CSS languages out in the wild (especially when JavaScript then comes along and moves all those declarations around anyway).
Sure, and the browsers that stray further off will get used less and die off.
And if you are using the latest and "greatest" JS features, you have to expect some failures. If you enjoy sitting on the bleeding edge, don't complain about getting cut.
If you implement features using known, simple and stable tech, things will generally work great without needing to worry about special cases.
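
In practice that usually means treating newer APIs as optional enhancements on top of a boring baseline. A minimal sketch of that idea, assuming a hypothetical `revealOnScroll` helper and a `.reveal` class purely for illustration (IntersectionObserver here just stands in for "newer feature"):

```typescript
// Stable baseline first: only opt into the newer API if this browser has it.
function revealOnScroll(el: HTMLElement): void {
  if ("IntersectionObserver" in window) {
    // Newer, nicer API -- used only when the browser actually implements it.
    const observer = new IntersectionObserver((entries) => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          entry.target.classList.add("visible");
          observer.unobserve(entry.target);
        }
      }
    });
    observer.observe(el);
  } else {
    // Stable fallback: just show the element. Less flashy, never breaks.
    el.classList.add("visible");
  }
}

document.querySelectorAll<HTMLElement>(".reveal").forEach((el) => revealOnScroll(el));
```

The point isn't this particular feature; it's that the fallback path uses tech every engine has handled the same way for years, so the "novel interaction" surface stays small.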