Hello. I know this isn’t completely related to Linux, but I was still curious about it.
I’ve been looking at Linux laptops, and one from TUXEDO caught my eye: 13 hours of battery life at idle, or 9 hours of web browsing. The thing is, that device has a 3K display.
My question is: as someone used to 1080p who always tries to maximise a laptop’s battery life, would running the display at a lower resolution help? And if so, is it even worth it, or are the benefits too small to notice?
I’d think so. 3K is a lot of pixels to compute and send 60 times a second.
But this video says the effect on battery life in their test was only about 6%, going from 4K down to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions… but what screen manufacturer would optimize energy consumption for anything but the maximum resolution? 🤔 I guess computing the pixels costs little compared to the expense of driving all those physical dots. But maybe if your web browser were ray-traced? … ?!
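If you’d rather measure it yourself than trust the video, many laptops expose the instantaneous battery draw through sysfs. Here’s a rough sketch, assuming a battery at `/sys/class/power_supply/BAT0` that reports `power_now` in microwatts (some models only expose `current_now`/`voltage_now`, so adjust accordingly):

```python
# Rough battery-draw sampler: run it once at 3K and once after switching to a
# lower resolution, with the same workload, and compare the averages.
# Assumes /sys/class/power_supply/BAT0/power_now exists and is in microwatts
# (true on many laptops, but not all).
import time

SAMPLES = 60          # one reading per second for a minute
PATH = "/sys/class/power_supply/BAT0/power_now"

readings = []
for _ in range(SAMPLES):
    with open(PATH) as f:
        readings.append(int(f.read().strip()))  # microwatts
    time.sleep(1)

avg_watts = sum(readings) / len(readings) / 1_000_000
print(f"average draw over {SAMPLES}s: {avg_watts:.2f} W")
```

Run it under the same workload at both resolutions and compare the two averages; that difference is the number you actually care about.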
Also, if you take a 2880x1800 screen and halve both dimensions (to avoid fractional scaling), you get 1440x900 (not to be confused with 1440p), which by pixel count is a little closer to 720p than to 1080p.
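Just to put concrete numbers on those pixel counts (plain arithmetic, nothing vendor-specific):

```python
# Pixel counts for the resolutions being discussed.
resolutions = {
    "2880x1800 (3K native)": (2880, 1800),
    "1440x900 (integer half)": (1440, 900),
    "1920x1080 (1080p)": (1920, 1080),
    "1280x720 (720p)": (1280, 720),
}

for name, (w, h) in resolutions.items():
    print(f"{name:>26}: {w * h:>9,} pixels")

# 2880x1800 = 5,184,000 -- exactly 4x the 1,296,000 of 1440x900, which sits
# between 720p (921,600) and 1080p (2,073,600), a bit nearer to 720p.
```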
But you don’t lower the number of physical pixels in use. When you lower the resolution, you just increase the number of physical pixels used to display each logical “pixel”. So the same amount of power goes into lighting them up.
Your GPU doesn’t need to re-render the entire screen every frame. Your compositor only submits the regions of the screen that actually changed for rendering, and most application stacks are very efficient at laying out elements to limit the work needed.
At higher resolutions those regions will obviously contain more pixels, but they’ll still take up roughly the same percentage of the screen area.
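A toy illustration of that point (the 10% figure is made up, it’s just there to show the proportionality): a changed region covering the same fraction of the screen contains more pixels at 3K than at 1080p, so the GPU does proportionally more work per update, but the share of the frame it has to touch stays the same.

```python
# Toy illustration: the same "10% of the screen changed" damage region
# at two resolutions. The absolute pixel count scales with resolution;
# the fraction of the frame does not. Numbers are illustrative only.
def damaged_pixels(width: int, height: int, fraction: float) -> int:
    return int(width * height * fraction)

for w, h in [(1920, 1080), (2880, 1800)]:
    total = w * h
    damaged = damaged_pixels(w, h, 0.10)
    print(f"{w}x{h}: {damaged:,} of {total:,} pixels redrawn ({damaged / total:.0%})")
```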