2 points

You don’t get this attitude about Macs? Are you willfully blind?

Plug a 1080p monitor into a Windows or Linux machine and notice how crisp and readable the text is, because they use sub-pixel text rendering, a technique that has been used for decades to make text readable on lower-resolution monitors.

Now plug that monitor into a macOS computer and notice the text looks like trash, because Apple ripped out their sub-pixel text rendering system to force users to buy their fancy high-res monitors.

1 point

Font rendering on Linux is still hit and miss. I recently had to troubleshoot an issue where only the titles of Wikipedia articles in Flatpak Firefox on openSUSE looked like ass, while the rest of the text, and all text in other browsers or on another distro, rendered fine.

4 points

I don’t actually own a 1080p monitor (nor an Apple one), and that’s a pretty specific reason to hate Macs if high resolution is your desire. I’m sure there are no similar issues with other platforms that someone could find as a reason to [presumably] turn their PCs into e-waste, which is the actual topic of this thread.

Hyperbolic much?

From another thread on this topic:

Even Microsoft themselves are moving away from it. They just left it on Windows as-is for those who still use old, standard-res LCDs. Their subpixel antialiasing (ClearType) has been disabled by default in Microsoft Office (and many of their other productivity products) for years.

The reason they are moving away from subpixel antialiasing is that the sole reason it exists is to work around a shortcoming of standard LCDs, where each big “pixel” consists of a row of RGB “subpixels”. Say you want to draw a line 1.5px wide: obviously you can’t divide a pixel in half. What people did was use some of the subpixels to make up that 0.5px (e.g. light up only the blue subpixel if the 0.5px is to the left, or conversely the red subpixel if it’s to the right). By using subpixel rendering on a standard LCD, you can “fool” the user by adding that extra colour on the side, which, when viewed on a standard LCD, looks smooth rather than jagged.

Now, obviously this “illusion” only works on a display whose big pixels consist of (in order) red, green, and blue subpixels. Since many people are moving toward high-resolution displays (Apple’s main reason) and there are many other display types with different subpixel arrangements (Microsoft’s main reason, and also Apple’s with their OLED products), there is no reason to use subpixel rendering anymore (in fact, using it on any display other than a standard LCD will look worse).
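
If it helps, here’s a rough, hypothetical sketch of that idea (simplified, not any real renderer’s code, and the function name is made up): treat each pixel as three side-by-side R, G, B stripes and compute ink coverage per stripe instead of per pixel. You effectively triple the horizontal resolution, but only if the panel really lays its subpixels out R, G, B left to right.

```python
# Hypothetical illustration of subpixel antialiasing on a 1D row of pixels.
# Each pixel is modelled as three horizontal subpixels laid out R, G, B
# left to right; we rasterize "ink" that fills everything left of a
# fractional edge position.

def rasterize_edge(edge_x: float, num_pixels: int) -> list[tuple[float, float, float]]:
    """Per-pixel (R, G, B) ink coverage for ink filling everything left of edge_x."""
    pixels = []
    for px in range(num_pixels):
        coverage = []
        for sub in range(3):  # subpixel `sub` spans [px + sub/3, px + (sub + 1)/3)
            left = px + sub / 3
            width = 1 / 3
            covered = min(max(edge_x - left, 0.0), width) / width
            coverage.append(round(covered, 2))
        pixels.append(tuple(coverage))
    return pixels

# A 1.5px-wide stroke ending at x = 1.5: pixel 0 is fully inked, pixel 1 has its
# red (leftmost) subpixel fully inked and its green subpixel half inked, pixel 2
# is untouched. On a true left-to-right RGB panel this reads as a smooth edge;
# on any other subpixel layout the same values just show up as colour fringing.
print(rasterize_edge(1.5, 3))
# [(1.0, 1.0, 1.0), (1.0, 0.5, 0.0), (0.0, 0.0, 0.0)]
```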

4 points

I don’t actually own a 1080p monitor (nor an Apple one), and that’s a pretty specific reason to hate Macs if high resolution is your desire.

No, it is one example among hundreds of Apple not prioritizing backwards compatibility, or even just third-party compatibility, because it would take a little extra effort from a couple of software engineers, and as a result we get piles and piles of physical e-waste.

As a company, Apple takes no responsibility for their role in compatibility and in ensuring that our (society’s) broad ecosystem of products keeps functioning; they only put effort into making sure that their own products, the ones they profit from, work and keep working.

-1 points

A little extra effort times “hundreds” of examples is a lot of extra effort…

Okay then. Thanks for your viewpoint.
