
He's saying he may upgrade to GTX 1080ti - that's extra 250W.

It's the first time I've heard that an oversized PSU leads to more power consumption - how come?



Somewhere in the power supply, power is converted via a switched inductance. The power transfer is controlled by the switching frequency, with roughly the same energy transferred each cycle (constant I_peak). Losses are roughly f*I_peak^2. There are several factors limiting the maximum f, so to get a bigger maximum power, bigger power supplies have a bigger I_peak. Thus, for the same transferred power, a bigger supply will have a bigger I_peak and a smaller f, but since I_peak appears squared, the loss is worse. (This is the rough explanation; the devil is in the details.)
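Taking that rough model literally (all numbers below are hypothetical, just to illustrate the scaling argument):

```python
# Sketch of the loss model above: power delivered is ~ f * E_cycle,
# switching loss is ~ f * I_peak**2. E_cycle is fixed by the design,
# and the argument implies it grows with I_peak.

def relative_loss(p_load, i_peak, e_cycle):
    """Loss proportional to f * I_peak^2, with f set by the load."""
    f = p_load / e_cycle   # cycles per second needed for this load
    return f * i_peak ** 2  # arbitrary units

# Two supplies delivering the same 100 W load; the bigger one is built
# around twice the I_peak (and, per the argument, twice the energy per
# cycle), so it runs at half the frequency for the same load.
small = relative_loss(100, i_peak=5, e_cycle=5)    # f = 20
big   = relative_loss(100, i_peak=10, e_cycle=10)  # f = 10
print(small, big)  # the bigger supply loses twice as much
```

Halving f doesn't compensate for doubling I_peak, because I_peak enters squared.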

Additionally, many power supplies are worse at load regulation if their load is too low.


I just looked at Corsair's efficiency graph and it seems that the best efficiency is around 50% of load on their PSUs.

Additionally, up to around 35% load the fan is off; passive cooling is enough.
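A quick sketch of what such a curve means for wall draw (the curve values here are made up to mimic the shape described, not Corsair's actual numbers):

```python
# Hypothetical efficiency curve for a 750 W unit: efficiency peaks near
# 50% of rated load and sags toward both ends, as described above.
RATED_W = 750
curve = {0.10: 0.85, 0.20: 0.89, 0.50: 0.92, 1.00: 0.89}  # load fraction -> efficiency

def wall_power(dc_load_w):
    """AC power drawn at the wall for a given DC load, using the
    nearest point on the (illustrative) efficiency curve."""
    frac = dc_load_w / RATED_W
    eff = curve[min(curve, key=lambda k: abs(k - frac))]
    return dc_load_w / eff

print(round(wall_power(375), 1))  # 375 W load (~50%): about 407.6 W at the wall
print(round(wall_power(75), 1))   # 75 W load (~10%): about 88.2 W at the wall
```

At light load the same DC draw costs proportionally more at the wall, which is why a heavily oversized unit idling at 10% can be worse than a smaller one running near its sweet spot.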


From what I'm reading, the 1080 Ti and 280X both draw just under 250W, so the change shouldn't be significant.


Also, PSUs have an efficiency: only about 80% of the consumed energy reaches the components; the rest is dissipated as heat, toaster-style. There are different certification tiers (80 Plus Bronze, Gold, Platinum, etc.).



