Notice how they don’t post the bitrate, because even the higher one will be extremely low. Every streaming service has been dropping their bitrates over the years, Netflix and HBO are the worst offenders as I’ve noticed. It probably saves them a ton of money, and 90% of their customers won’t notice because they’re on their phone while watching in the background.
To make it weirder, I’m confident they boost the bitrates on their new releases to get the approval of the enthusiastic viewers, then drop it after the reviews are in.
So the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.
I’m highly skeptical of the claim that streaming services have intentionally dropped their bitrates at the expense of perceived quality. There’s definitely research going on into delivering the same level of perceived quality at lower average bitrates through variable bitrate encodings and so on, but this is sophisticated research where perceived quality is carefully controlled for.
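As a rough sketch of what that kind of work looks like in practice (purely illustrative; the file names and settings below are hypothetical, and I’m not claiming any particular service does exactly this), compare a fixed-bitrate encode with a constant-quality (CRF) encode using ffmpeg:

```python
# Illustrative comparison of fixed-bitrate vs. quality-targeted encoding.
# Assumes ffmpeg with libx264 is installed; "input.mp4" is a hypothetical source file.
import subprocess

# Fixed average bitrate: every piece of content gets ~4 Mbps, whether it needs it or not.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "libx264", "-b:v", "4000k", "out_abr.mp4"],
    check=True,
)

# Constant-quality (CRF) encoding: the encoder varies the bitrate with scene complexity,
# spending fewer bits on simple scenes while holding perceived quality roughly constant.
subprocess.run(
    ["ffmpeg", "-i", "input.mp4", "-c:v", "libx264", "-crf", "20", "out_crf.mp4"],
    check=True,
)
```

Per-title and per-shot encoding pipelines, as publicly described by Netflix among others, push this idea further by choosing settings per piece of content based on measured quality.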
It probably saves them a ton of money, and 90% of their customers won’t notice because they’re on their phone while watching in the background.
So this is fundamentally not how video streaming works, and I think this is important for the average person to learn: if you stream a video in the background or with your screen turned off, video data will stop loading. There’s literally no point in continuing to fetch the video track if it’s not being rendered. It would be like downloading the French audio track while the user is listening to the English one, i.e. nonsensical.
This rules it out as a possible reason for any video streamer to intentionally reduce their bitrate, since the savings would never materialize for background playback.
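To make that concrete, here’s a toy fetch loop for a hypothetical adaptive-streaming player (every name here is made up for illustration; real DASH/HLS clients are far more involved): when nothing is being rendered, video segments are simply never requested, so there’s no video bandwidth left to “save”.

```python
# Toy model of a streaming player's segment fetcher. All names are hypothetical;
# real DASH/HLS players are far more complex, but the core idea is the same:
# tracks that aren't being consumed don't get downloaded.
def fetch_segment(track: str, index: int) -> None:
    # Stand-in for an HTTP segment request.
    print(f"downloading {track} segment {index}")

def playback_loop(video_is_visible, segments: int) -> None:
    for i in range(segments):
        fetch_segment("audio-en", i)   # the selected audio track is always needed
        if video_is_visible():
            fetch_segment("video", i)  # video is only fetched when it will be rendered
        # unselected audio languages and subtitle tracks are never fetched at all

# Screen off / app in the background: only audio segments get requested.
playback_loop(video_is_visible=lambda: False, segments=3)
```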
To make it weirder, I’m confident they boost the bitrates on their new releases to get the approval of the enthusiastic viewers, then drop it after the reviews are in.
Depending on the usage patterns for the platform in question, this probably doesn’t make sense either.
the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.
But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy of quality. For TVs and other screens this is mostly true, but for video it isn’t the best metric (lossless video aside).
Bitrate is probably a better metric, but even then it isn’t great. Different codecs and encoding settings can result in much better quality at the same bitrate. But I think in most cases it correlates better with quality than resolution does.
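One crude way to see why resolution alone is misleading is bits per pixel per frame. The streams and numbers below are made up purely for illustration, but they show how a 4K stream at a starved bitrate spends far fewer bits on each pixel than a 1080p stream at a healthier one:

```python
# Crude bits-per-pixel heuristic. Streams and numbers are hypothetical,
# chosen only to show why resolution alone says little about quality.
def bits_per_pixel(bitrate_bps: int, width: int, height: int, fps: int) -> float:
    return bitrate_bps / (width * height * fps)

streams = {
    "2160p @ 8 Mbps":  (8_000_000, 3840, 2160, 24),
    "1080p @ 8 Mbps":  (8_000_000, 1920, 1080, 24),
    "1080p @ 20 Mbps": (20_000_000, 1920, 1080, 24),
}

for name, (bps, w, h, fps) in streams.items():
    print(f"{name:16} {bits_per_pixel(bps, w, h, fps):.3f} bits/pixel/frame")

# 2160p @ 8 Mbps   0.040 bits/pixel/frame
# 1080p @ 8 Mbps   0.161 bits/pixel/frame
# 1080p @ 20 Mbps  0.402 bits/pixel/frame
```

Of course, codec efficiency shifts these numbers around a lot (HEVC or AV1 needs fewer bits per pixel than H.264 for comparable quality), so it’s still only a rough guide.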
The ideal metric would probably be some sort of actual perceptual quality metric (PSNR, SSIM, VMAF, and the like), but none of those are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.
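For what it’s worth, a metric like VMAF is straightforward to compute yourself if you have the reference, e.g. with an ffmpeg build that includes libvmaf; a minimal sketch with hypothetical file names:

```python
# Minimal sketch of scoring an encode against its source with VMAF.
# Assumes an ffmpeg build compiled with libvmaf; file names are hypothetical.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "encoded.mp4",    # distorted/encoded version
        "-i", "reference.mp4",  # original source
        "-lavfi", "libvmaf=log_path=vmaf.json:log_fmt=json",
        "-f", "null", "-",
    ],
    check=True,
)
# The aggregate score lands in vmaf.json (0-100 scale, higher is better),
# but like any single number it still hides plenty of nuance.
```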
I think resolution does come with one advantage over posting bitrates: whenever you render a lower-resolution video on a higher-resolution surface, it has to be upscaled, with all of the negative consequences that has for perceived quality. I imagine there’s also an intuitive sense that a larger resolution implies a higher bitrate (necessarily, to capture the additional information).