I have a very slow Internet connection (5 Mbps down, and even less for upload). Given that, I always download movies at 720p, since the files are smaller, which means I can download them more quickly. Also, I don’t notice much of a difference between 1080p and 720p. As for 4K, because I don’t have a screen that can display 4K, I consider it one of the biggest disk space wasters.
Am I the only one who has this opinion?
You’re not alone.
On a good large screen, 1080p is a noticeable upgrade from 720p.
But the distance you’d have to sit at to get much out of 2160p over 1080p is just way too close.
However, the High Dynamic Range that comes with 4K formats and releases IS a big difference.
On the other hand, storage is pretty cheap. A couple of cents per GB, really.
But you’re talking more about bandwidth, which can be expensive.
But yeah. You’re not alone.
You don’t really prefer a lower resolution, you just work within the limitations you have.
Also, I don’t notice much of a difference between 1080p and 720p
Either your display is really shitty or you need (better) glasses. This isn’t like the difference between 60 and 144 Hz, where it’s barely visible to untrained eyes.
Completely true, but compression can also make anything look bad. I’ve seen 480p look better than 1080p simply because the 480p was using a higher bitrate, while the 1080p was encoded without enough, relatively speaking.
To be fair, resolution is not enough to measure quality. The bitrate plays a huge role. You can have a high resolution video looking worse than a lower resolution one if the lower one has a higher bitrate. In general, many videos online claim to be 1080p but still look like garbage because of the low bitrate (e.g. on YouTube). If you go for a high bitrate video, you should be able to tell pretty easily: the hair, the fabric, the skin details, the grass, everything can be noticeably sharper and crisper.
Edit: so yeah, I agree with you, because often they are both of low bitrate…
Great wizard of the bitrates, grant me your wisdom…
I can’t wrap my head around bitrate - if I have a full HD monitor and the media is in full HD, then how is it that the rate of bits can make so much difference?
If each frame in the media contains the exact 1920 × 1080 pixels beamed into their respective positions on the display, then how can there be a difference? Does it have something to do with compression?
Exactly, this is about compression. Just imagine a full HD image, 1920x1080, with 8 bits of color for each of the 3 RGB channels. That would lead to 1920x1080x8x3 = 49,766,400 bits, or roughly 50 Mb (or roughly 6 MB). This is uncompressed. Now imagine a video at 24 frames per second (typical for movies): that’s almost 1200 Mb per second. For a 1h30 movie, that would be an immense amount of storage, just compute it :)
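If you actually want to compute it, here’s a quick back-of-the-envelope sketch in Python (purely illustrative, using the same assumptions as above):

```python
# Uncompressed size of a 1080p, 24 fps, 90-minute movie (8 bits per RGB channel).
width, height = 1920, 1080
bits_per_pixel = 3 * 8              # 3 channels, 8 bits each
fps = 24
duration_s = 90 * 60                # 1h30 in seconds

bits_per_frame = width * height * bits_per_pixel   # 49,766,400 bits (~50 Mb, ~6 MB)
bits_per_second = bits_per_frame * fps             # ~1194 Mb every second
total_bytes = bits_per_second * duration_s // 8    # bits -> bytes for the whole movie

print(f"per frame:   {bits_per_frame / 1e6:.1f} Mb")
print(f"per second:  {bits_per_second / 1e6:.0f} Mb/s")
print(f"whole movie: {total_bytes / 1e9:.0f} GB uncompressed")
```

That works out to roughly 800 GB for an hour and a half of raw 1080p, which is exactly why every movie file you ever see is lossily compressed.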
To solve this, movies are compressed (encoded). There are two types: lossless (where the information is preserved exactly and no quality is lost) and lossy (where quality is degraded). Lossy compression is the common choice because it leads to the biggest storage savings. For a given compression algorithm, the less bandwidth you allow it, the more video quality it has to sacrifice to meet your requirements. And this is what bitrate refers to.
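And going the other way, once you pick an average bitrate, the file size follows directly. Here’s a rough sketch; the bitrates below are made-up example values, not tied to any particular codec or release:

```python
# Rough file size of a 90-minute movie at a few made-up example average bitrates.
duration_s = 90 * 60

for label, mbit_per_s in [("low", 3), ("medium", 8), ("high", 25)]:
    size_gb = mbit_per_s * 1e6 * duration_s / 8 / 1e9   # Mb/s -> total bytes -> GB
    print(f"{label:>6} ({mbit_per_s:>2} Mb/s): ~{size_gb:.1f} GB")
```

File size is just average bitrate times duration, so picking a bitrate is literally picking how much detail the encoder is allowed to keep.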
Of note: different compression algorithms are more or less effective at storing data within the same file size. AV1, for instance, will allow for significantly higher video quality than h264 at the same file size (or bitrate).
If each frame in the media contains the exact 1920 × 1080 pixels …
This image has the same number of pixels in the top and bottom halves, but you can probably see the bottom half looks worse. That’s what lower bitrate does. It’s like turning up the compression on a JPEG: you are not getting the exact same pixels, just the same image dimensions.
Simple explanation: the higher the bitrate, the more data is dedicated to each frame, so the higher the quality of each frame, assuming the same resolution. This means fewer artifacts, less blocking, less color banding, etc.
Lower bitrate is the opposite, basically. The video is more compressed, and in the process it throws out as much information as possible while trying to maintain acceptable quality. The lower the bitrate, the more information is thrown out for the sake of a smaller filesize.
Resolution is the biggest factor that affects picture quality at the same bitrate. A 1080p video has a quarter of the resolution of a 2160p video, so it takes much less data to maintain a high quality picture.
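Just to put numbers on “a quarter of the resolution” (raw pixel counts only, nothing encoder-specific):

```python
# Raw pixel counts per frame at common resolutions.
for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080), "2160p": (3840, 2160)}.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.2f} megapixels")
```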
Yes, every video you download or stream is actually compressed quite a lot; the bitrate just determines how much compression is applied. A higher bitrate means the file is bigger and less compression is done, while a lower bitrate means the video has a lot fewer bits to store all that data, so it has to compress more.
Here’s my twisted life exposed… I have no issue watching 1080p on my QLED 4K TV. I game at 1080p happily, and I honestly don’t give a shit about 4K content.
1080p looks good enough for me, and I actually watch 720p on my phone screen half the time too.
And not because of lack of speed, I have a 1Gbps+ fiber line up and down.
And tbh, if it means I get to own and control my media, I would tolerate even worse quality if that’s what I needed to do.
Grunge computing ftw! Quality at the cost of your soul? Fuck that!
That’s less of an opinion and more of a hardware restriction, isn’t it?
If I had a 5 Mbps connection or no display that could show 4K, I also wouldn’t download in 4K.