They can't really save power on LED-backlit displays, though on OLED black is cheaper than other colors.
But unnecessary animations (like blinking text cursors) cause the entire graphics hardware stack to wake up all the way to the display, which is a serious power cost.
The original CGA/EGA/VGA text modes had a hardware cursor blinking option; I find blinking cursors useful, so it'd be interesting if modern GPUs added a low-power animation loop feature.
Their power cost is actually a pretty recent development. It used to be that it didn't matter because the display refreshed every frame no matter what, but in the last few years laptop displays added panel self-refresh. If the screen becomes static for a little while, the host side can stop driving it entirely and it'll keep the same picture up.
I'd guess it wouldn't be hard to keep two frames in panel memory and alternate between them, so you could get a single blinking element without waking the host.
Except when it is transitioning color temperatures, I don't see why this would be the case. I reverse-engineered f.lux and it just calls some UpdateRGBLUT-type function based on the color temperature. I think this LUT is applied to the frame buffer (perhaps this is how color profiles work?) even if f.lux is not being used.
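For reference, on Windows the per-channel hardware LUT that tools like this adjust is reachable through GDI's SetDeviceGammaRamp. Here's a minimal sketch of what an "UpdateRGBLUT"-style call could look like; the update_rgb_lut name and the scaling factors are made up for illustration, not f.lux's actual code or curve:

    /* Compile/link against gdi32 and user32. */
    #include <windows.h>

    static void update_rgb_lut(double r_scale, double g_scale, double b_scale)
    {
        WORD ramp[3][256];              /* [0]=red, [1]=green, [2]=blue */
        for (int i = 0; i < 256; i++) {
            /* Identity ramp is i * 257 (maps 0..255 onto 0..65535);
             * scaling each channel shifts the white point. */
            ramp[0][i] = (WORD)min(65535.0, i * 257.0 * r_scale);
            ramp[1][i] = (WORD)min(65535.0, i * 257.0 * g_scale);
            ramp[2][i] = (WORD)min(65535.0, i * 257.0 * b_scale);
        }
        HDC hdc = GetDC(NULL);          /* device context for the primary display */
        SetDeviceGammaRamp(hdc, ramp);  /* LUT applied in hardware to the whole frame buffer */
        ReleaseDC(NULL, hdc);
    }

    int main(void)
    {
        /* Roughly "warmer" white point: keep red, attenuate green and blue. */
        update_rgb_lut(1.00, 0.82, 0.65);
        return 0;
    }

Since the ramp lives in the display pipeline rather than in the application, it keeps affecting output after the call returns, which is consistent with the LUT staying applied even when f.lux isn't running.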
First, f.lux periodically adjusts the LUT, so it does call this fairly often. Second, in my experience, having this functionality enabled seriously increases the odds that my laptop will switch from the Intel GPU to the Nvidia GPU.