Hello. I know this isn’t completely related to Linux, but I was still curious about it.
I’ve been looking at Linux laptops, and one that caught my eye from Tuxedo had 13 hours of battery life at idle, or 9 hours of web browsing. The thing is, that device had a 3K display.
My question is: as someone used to 1080p who always tries to get the maximum battery life out of a laptop, would running the display at a lower resolution help? And if so, is it even worth it, or are the benefits too small to notice?
You’ll have to downscale the resolution unless you have superhuman vision. I suspect the laptop is configured to ~150% scaling out of the box, which would mean those battery estimates are based on that as well.
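Rough math, assuming the panel is 2880x1800 (an assumption, I haven’t checked Tuxedo’s spec sheet):

```python
# Rough sketch: what ~150% UI scaling means in practice.
# Panel resolution is an assumption, not taken from a spec sheet.
panel_w, panel_h = 2880, 1800
scale = 1.5  # ~150% out of the box

logical_w, logical_h = panel_w / scale, panel_h / scale
print(f"logical workspace: {logical_w:.0f}x{logical_h:.0f}")
# -> logical workspace: 1920x1200, i.e. UI elements roughly the size
#    you'd get on a 1080p-class screen, just rendered sharper.
```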
I don’t think upscaling the text/UI and downscaling the whole screen are the same thing.
No, the majority of the energy consumption is in the backlight.
Yes, but by very little.
You’re saving on GPU processing, but that’s unlikely to be that much for browsing.
Maybe if it allowed you to switch from discrete to integrated graphics, putting the discrete GPU to sleep.
For just browsing, even integrated graphics has been plenty since the beginning of the internet, maybe with some exceptions when Flash gaming reached its pinnacle.
That might save a bit of power, but your dedicated GPU is usually in an idle/powered-down state until your compositor hands it specific applications to accelerate. For Nvidia laptops this is what the PRIME/Optimus feature does.
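For what it’s worth, the offload direction is opt-in per application. Something like this sketch (using NVIDIA’s documented PRIME render offload environment variables) runs one program on the dGPU while everything else stays on the iGPU:

```python
import os
import subprocess

# Everything renders on the iGPU by default; NVIDIA's PRIME render
# offload variables opt a single program into the discrete GPU.
env = dict(os.environ,
           __NV_PRIME_RENDER_OFFLOAD="1",
           __GLX_VENDOR_LIBRARY_NAME="nvidia")

# "glxinfo -B" just reports which GPU ends up rendering; swap in any app.
subprocess.run(["glxinfo", "-B"], env=env, check=False)
```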
Using the iGPU might save power, but the resolution doesn’t need to be turned down for that.
I’d think so. 3K is a lot of pixels to compute and send 60 times a second.
But this video says the effect on battery life in their test was like 6%, going from 4K to 800x600. I can imagine that some screens are better at saving power when running at lower resolutions… but what screen manufacturer would optimize energy consumption for anything but maximum resolution? 🤔 I guess the computation of the pixels isn’t much compared to the expense of having those physical dots. But maybe if your web browser were ray-traced? … ?!
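Back-of-the-envelope, just counting how many pixels have to be pushed per second (ignoring the backlight entirely):

```python
# Raw pixel throughput at 60 Hz for a few resolutions.
def pixels_per_second(width, height, hz=60):
    return width * height * hz

for name, (w, h) in {"2880x1800 (3K)": (2880, 1800),
                     "1920x1080": (1920, 1080),
                     "800x600": (800, 600)}.items():
    print(f"{name}: {pixels_per_second(w, h) / 1e9:.2f} Gpixels/s")
# ~0.31 vs ~0.12 vs ~0.03 Gpixels/s: a big relative difference, but all
# of it is cheap next to keeping the backlight lit, which would explain
# why the measured battery difference was only a few percent.
```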
Also, if you take a 2880x1800 screen and divide each dimension by 2 (to avoid fractional scaling), you get 1440x900 (not 1440p), which is a little closer to 720p than to 1080p.
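Pixel counts for comparison:

```python
# 2x integer downscale of the panel vs. the common 16:9 resolutions.
resolutions = {
    "2880x1800 (native)": (2880, 1800),
    "1440x900 (native / 2)": (1440, 900),
    "1920x1080 (1080p)": (1920, 1080),
    "1280x720 (720p)": (1280, 720),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} Mpx")
# 1440x900 is ~1.30 Mpx, sitting between 720p (~0.92 Mpx) and
# 1080p (~2.07 Mpx), and a bit closer to 720p by raw pixel count.
```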
Your GPU doesn’t need to re-render your entire screen every frame. Your compositor only submits the regions of the screen that actually changed for rendering, and most application stacks are very efficient at laying out elements to limit the work needed.
At higher resolutions those regions will obviously be larger, but they’ll still take up roughly the same % of the screen space.
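A toy illustration of that (the numbers are made up, just to show the cost tracks the damaged region rather than the whole frame):

```python
# Toy damage-tracking sketch: only the regions that changed get
# recomposited, so the work scales with the damaged area.
def damage_cost(screen_w, screen_h, frac_w, frac_h):
    region_px = int(screen_w * frac_w) * int(screen_h * frac_h)
    return region_px, region_px / (screen_w * screen_h)

# Hypothetical frame: a scrolling area covering 40% x 30% of the screen.
for name, (w, h) in {"2880x1800": (2880, 1800), "1920x1080": (1920, 1080)}.items():
    px, frac = damage_cost(w, h, 0.4, 0.3)
    print(f"{name}: {px / 1e6:.2f} Mpx redrawn ({frac:.0%} of the frame)")
# Both redraw 12% of the frame; the 3K panel just pushes ~2.5x the
# pixels for that same region.
```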