Hacker News

This 16 inch laptop has a 2560x1600 resolution. If you set it to 200% scaling, that's the equivalent of 1280x800 logical pixels, which is what you'd expect from a 13 inch laptop. For a 16 inch laptop there aren't enough pixels.

Why set it to 200%? Because, as Steve Jobs explained, the only scales that look good after 100% are 200%, then 400%. If you set it to any in-between scale (such as 150% or 300%) you will get display artifacts, such as horizontal lines appearing to have different widths when they are in fact all set to 1px.
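The uneven-line-width artifact described above is easy to model: at a fractional scale, the edges of a 1 CSS px line fall between physical pixels, so rounding makes some lines fatter than others. A minimal sketch, assuming a simple snap-to-nearest-pixel rule (real rasterizers use subpixel coverage or antialiasing, but the rounding problem is the same):

```python
# Toy model of how consecutive 1-CSS-px horizontal lines land on
# physical pixels at a given scale factor. Each line edge is snapped
# to the nearest physical pixel boundary (simplified rounding rule).
def physical_widths(scale, num_lines=6):
    """Physical-pixel width of consecutive 1px-wide lines at `scale`."""
    snap = lambda x: int(x + 0.5)  # round half up to a pixel boundary
    return [snap((i + 1) * scale) - snap(i * scale)
            for i in range(num_lines)]

print(physical_widths(2.0))  # 200%: every line is exactly 2 px wide
print(physical_widths(1.5))  # 150%: widths alternate between 2 px and 1 px
print(physical_widths(3.0))  # 300%: every line is exactly 3 px wide
```

At 150%, lines that are all nominally 1px come out alternately 1 and 2 physical pixels wide; at 200% and 300% every line gets the same integer width.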



> If you set it to any in-between scale (such as 150% or 300%) you will get display artifacts, such as horizontal lines appearing to have different widths when they are in fact all set to 1px.

This is only true of the approach macOS takes. When set to 150%, a macOS app renders at 200% and the compositor downscales the result to the panel resolution. On Windows, however, there is no downscaling: the app renders directly at 150%, avoiding those artifacts.
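The macOS approach described above can be sketched as a two-step pipeline: render at 2x, then resample the 2x buffer down to the physical resolution (2x shown at 1.5x is a 0.75 downscale). Below is a simplified 1-D box-filter model, an assumption for illustration; the real compositor's resampling filter is an implementation detail:

```python
# Sketch of macOS-style fractional scaling: the app renders at 2x,
# then the compositor resamples that buffer to the panel resolution.
def downscale(buffer_2x, factor):
    """Box-filter resample of a 1-D row of ink-coverage values."""
    out_len = int(len(buffer_2x) * factor)
    out = []
    for i in range(out_len):
        # Physical pixel i covers source range [i/factor, (i+1)/factor).
        lo, hi = i / factor, (i + 1) / factor
        total = 0.0
        for s, value in enumerate(buffer_2x):
            overlap = max(0.0, min(hi, s + 1) - max(lo, s))
            total += value * overlap
        out.append(total * factor)  # normalize to per-pixel coverage
    return out

# A 1-logical-px line rendered at 2x: two fully inked samples (1 = ink).
row_2x = [0, 0, 1, 1, 0, 0, 0, 0]
# After the 2x -> 1.5x downscale, the line's ink spreads across
# neighboring physical pixels with partial coverage, i.e. it blurs.
print(downscale(row_2x, 0.75))
```

The line that was crisp in the 2x buffer ends up with fractional coverage on its edges after resampling, which is the softness people notice on non-integer macOS scales.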


That is only true for apps where everything is vector-drawn and the app is scale-aware, since apps draw into the pixel buffer themselves. Many ship rasterized assets at fixed resolutions.


Weird... I have only ever seen this artifact on Windows machines, never on a Mac.


Fortunately, on Windows and Linux you don't have to scale the way Apple does, pixel-doubling everything. And even Apple has shipped laptops where the default display resolution is not an integer scale of the panel resolution (e.g. the 12" MacBook).


I have been to Best Buy and tried out laptops with non-integer scaling (150% or even 300%) and observed the display artifact I mentioned previously.


There are no artifacts at 300% scaling – that's exactly 3x3 real pixels per CSS pixel.
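The arithmetic behind that rebuttal, as a tiny sketch (`is_integer_scale` is a hypothetical helper for illustration, not a real API):

```python
# A scale is "integer" when each CSS pixel maps to a whole number of
# device pixels; only then do 1px lines stay uniform in width.
def is_integer_scale(percent):
    return percent % 100 == 0

print(is_integer_scale(300))  # True: 3x3 device pixels per CSS pixel
print(is_integer_scale(150))  # False: 1.5x leaves fractional pixel edges
```

So 300% is an integer scale like 200% or 400%; only fractional multiples such as 150% produce the rounding artifacts discussed upthread.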



