
I could have sworn I read somewhere that Activity Monitor also included GPU usage in its measurements—naturally, using the GPU more uses more power/battery.


It does. I found a list of factors and posted it as a comment on the blog post:

https://blog.mozilla.org/nnethercote/2015/08/26/what-does-th...

In the data I posted, which is from /usr/share/pmenergy/default.plist, kgpu_time is 0.0, but in all the other plists it's nonzero.
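If anyone wants to poke at those coefficients themselves, a quick sketch is to load the plists with Python's plistlib. This assumes the files live in /usr/share/pmenergy as described above and that kgpu_time is the key of interest:

    # Sketch: print the GPU-time coefficient from each power-model plist.
    # Assumes the plists live in /usr/share/pmenergy, per the comment above.
    import glob
    import plistlib

    for path in sorted(glob.glob("/usr/share/pmenergy/*.plist")):
        with open(path, "rb") as f:
            coefficients = plistlib.load(f)
        print(path, coefficients.get("kgpu_time"))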


Oh wow, nice find. It looks like that calculation is pretty comprehensive, then.


I've updated the post to incorporate your findings. Thank you!


I imagine that other things like WiFi usage, disk usage, and possibly even port usage could be factored in as well.


I'd also guess backlight.


Can apps control backlight level? I can't recall using one that's ever tried.


They can't really control it on LED-backlit LCDs, though on OLED displays black is cheaper than other colors.

But unnecessary animations (like blinking text cursors) cause the entire graphics hardware stack to wake up all the way to the display, which is a serious power cost.


The original CGA/EGA/VGA text modes had a hardware cursor blinking option; I find blinking cursors useful, so it'd be interesting if modern GPUs added a low-power animation loop feature.


Their power cost is actually a pretty recent development. It used to be that it didn't matter because the display refreshed every frame no matter what, but in the last few years laptop displays added panel self-refresh. If the screen becomes static for a little while, the host side can stop driving it entirely and it'll keep the same picture up.

I'd guess it wouldn't be hard to keep two frames in the panel's memory and alternate between them, so you could get a blinking cursor without waking the rest of the stack.


What about f.lux?


f.lux adjusts color temperature, not backlight level.


...and ironically the way f.lux works means it stresses out the GPU more when it is adjusting colour temperature, so...


Except when it is transitioning color temperatures, I don't see why this would be the case. I reverse-engineered f.lux, and it just calls some UpdateRGBLUT-type function based on the color temperature. I think this LUT is applied to the frame buffer (perhaps this is how color profiles work?) even when f.lux is not being used.
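For concreteness, a LUT like that is just a per-channel ramp scaled toward the target whitepoint. A toy sketch (the function name and multipliers are made up for illustration, not f.lux's actual curve):

    # Toy sketch of a per-channel color-temperature LUT; the multipliers
    # are made up for illustration, not f.lux's actual whitepoint curve.
    def build_lut(r_mult, g_mult, b_mult, size=256):
        ramp = [i / (size - 1) for i in range(size)]
        return (
            [round(255 * v * r_mult) for v in ramp],
            [round(255 * v * g_mult) for v in ramp],
            [round(255 * v * b_mult) for v in ramp],
        )

    # A "warm" setting: red untouched, green pulled down a bit, blue a lot.
    red_lut, green_lut, blue_lut = build_lut(1.0, 0.85, 0.65)

If the hardware runs every frame through the table anyway, the only extra work is the occasional update, which is why I'd expect the steady-state cost to be near zero.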


First, f.lux periodically adjusts the LUT, so it does call this function fairly often. Second, in my experience, enabling this functionality seriously increases the odds that my laptop will switch from using the Intel GPU to the Nvidia GPU.


Are you sure? I use redshift, and I believe it can control the backlight in addition to the color.


Or possibly saves power, if work is being offloaded from the CPU.



