Somewhere in the power supply, power is converted via a switched inductor. The power transfer is controlled by the switching frequency, with roughly the same energy transferred in each cycle (constant I_peak). Losses are roughly proportional to f*I_peak^2. Several factors limit the maximum f, so to reach a higher maximum power, bigger power supplies use a bigger I_peak. Thus, for the same transferred power, a bigger supply runs at a bigger I_peak and a smaller f, but since I_peak appears squared, the losses are worse.
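A toy calculation of that scaling, under the simplifying assumptions that the energy delivered per cycle grows linearly with I_peak and the loss per cycle grows with I_peak^2 (the constants e_per_amp_j and k are made-up illustration values, not real PSU parameters):

```python
def switching_loss(power_w, i_peak_a, e_per_amp_j=1.0, k=1.0):
    """Toy model: loss ~ f * I_peak^2, with f chosen so that
    f * E_cycle equals the requested output power."""
    e_cycle_j = e_per_amp_j * i_peak_a   # assumed: energy per cycle ~ I_peak
    f_hz = power_w / e_cycle_j           # frequency needed for this power
    return k * f_hz * i_peak_a ** 2     # loss ~ f * I_peak^2

# Same 100 W delivered: a "small" supply at I_peak = 2 A vs a
# "big" supply at I_peak = 8 A. The big one switches slower but
# loses 4x as much, because I_peak enters squared.
small = switching_loss(100, 2)
big = switching_loss(100, 8)
print(big / small)  # 4.0
```

The exact exponent depends on the topology and on how E_cycle really scales with I_peak, which is exactly the "devil in the details" part.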
(This is the rough explanation; the devil is in the details.)
Additionally, many power supplies have worse load regulation when their load is too low.
Also, PSUs have an efficiency: only about 80% (or more, depending on the rating) of the energy drawn from the wall is delivered to the components; the rest ends up as heat, so the PSU doubles as a small toaster/heater. The 80 Plus certification defines tiers for this (Bronze, Silver, Gold, Platinum, Titanium), based on efficiency at various load levels.
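To make the efficiency arithmetic concrete (the 80% figure here is just an example matching the baseline above):

```python
def wall_draw(load_w, efficiency):
    """Power drawn from the wall for a given DC load and PSU efficiency."""
    return load_w / efficiency

load = 300.0                   # watts actually delivered to the components
draw = wall_draw(load, 0.80)   # an 80%-efficient PSU
heat = draw - load             # dissipated inside the PSU as heat
print(draw, heat)  # 375.0 75.0
```

So the wall draw is always the component load divided by the efficiency at that operating point, and since efficiency curves typically sag at very light loads, an oversized PSU can draw noticeably more from the wall for the same load.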
This is the first time I've heard that an oversized PSU leads to more power consumption - how come?