Subpixel antialiasing’s Achilles’ heel

Some people are complaining about Mac OS X applications that do not follow their preference for subpixel antialiasing. Pierre Igot thinks this is due to incompetence. Michael Tsai speculates that there is a hidden issue that makes Quartz choose standard antialiasing. John Gruber just finds this curious. I know exactly why this happens, and why it isn’t easy to fix.

Michael Tsai thinks this is linked to off-screen buffers that are later composited together:

The pattern I see is that applications like Pages and OmniGraffle draw multiple layers with possibly overlapping elements. A logical way to implement this is to draw the elements separately into off-screen buffers and then composite them together.

And he is exactly right. The problem is that these buffers have a transparent background, on which Quartz never applies subpixel antialiasing. And that’s not incompetence: it is simply impossible to draw subpixel-antialiased text on a transparent background. The culprit is called aRGB. Can you spot it?

If not, here is the explanation.

Subpixel antialiasing works by giving different values to the three subpixels inside a pixel (red, green, blue) based on how far each of them is from the mathematical outline of the character’s glyph. When drawing, the compositor must blend each subpixel’s value with the background using a different level of transparency.

The “a” in aRGB stands for alpha: the transparency value for the whole pixel. That’s the catch, the whole pixel: when aRGB pixels are later composited with other layers, each subpixel is mixed with the same level of transparency. This is incompatible with subpixel antialiasing, which requires a different transparency level for each of the three subpixels.
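To make the incompatibility concrete, here is a small Python sketch (my own illustration, not anything from Quartz) comparing per-subpixel compositing with single-alpha aRGB compositing:

```python
# Per-subpixel compositing: each of the three channels has its own
# coverage value, so an edge pixel can mix red, green and blue with
# the background in different proportions.
def composite_subpixel(text_rgb, coverage_rgb, bg_rgb):
    return tuple(t * c + b * (1 - c)
                 for t, c, b in zip(text_rgb, coverage_rgb, bg_rgb))

# aRGB compositing: a single alpha is shared by all three channels.
def composite_argb(text_rgb, alpha, bg_rgb):
    return tuple(t * alpha + b * (1 - alpha)
                 for t, b in zip(text_rgb, bg_rgb))

# Black text on white, edge-pixel coverages 0.9 / 0.5 / 0.1:
# the three channels come out different (roughly 0.1, 0.5, 0.9).
print(composite_subpixel((0, 0, 0), (0.9, 0.5, 0.1), (1, 1, 1)))

# With one alpha, all three channels get the same mix: no single
# alpha value can reproduce the per-subpixel result above.
print(composite_argb((0, 0, 0), 0.5, (1, 1, 1)))
```

The point is the last two calls: once a glyph is flattened into one alpha per pixel, the three distinct channel mixes that subpixel antialiasing needs are lost.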

That said, I think Quartz is pretty smart, because subpixel antialiasing works on semitransparent backgrounds, like menus pulled down from the menu bar. It implements some sort of gradual degradation of the subpixel values. For instance, if the background on which it draws text is 50% transparent, half of each subpixel value will come from standard antialiasing, the other half from subpixel antialiasing.
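Here is what that gradual degradation might look like in code. This is my guess at the behavior described above, a simple linear blend between the per-subpixel coverages and their average, not Quartz’s actual algorithm:

```python
def degraded_coverage(coverage_rgb, bg_alpha):
    # The average coverage is what plain (standard) antialiasing uses.
    gray = sum(coverage_rgb) / 3.0
    # The more opaque the background, the more subpixel detail is kept.
    return tuple(bg_alpha * c + (1 - bg_alpha) * gray
                 for c in coverage_rgb)

edge = (0.9, 0.5, 0.1)
print(degraded_coverage(edge, 1.0))  # opaque: full subpixel antialiasing
print(degraded_coverage(edge, 0.5))  # 50%: halfway, roughly (0.7, 0.5, 0.3)
print(degraded_coverage(edge, 0.0))  # fully transparent: standard antialiasing
```

At alpha 0 all three subpixels collapse to the same value, which is exactly the “standard antialiasing on transparent backgrounds” behavior described above.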

So if an application draws in an off-screen buffer with a transparent background, surprise: no subpixel antialiasing. Until Apple implements some kind of aRaGaB compositor, application developers will have to choose between a fast application with off-screen rendering and subpixel antialiasing, but not both. I think the choice is an easy one for developers.


  1. A hypothetical aRaGaB compositor wouldn’t be accelerated by the graphics card anyway (graphics cards use aRGB); proper subpixel transparency may not be enough of an improvement to justify a slower and more RAM-hungry compositor.

  2. Subpixel antialiasing does not work for text with a shadow either, probably because Quartz internally renders the text in an off-screen buffer to compute the shadow.

  3. You can test and see partial subpixel antialiasing on semitransparent backgrounds by playing with the transparency of a Terminal window over another window of the same color. When the background becomes completely transparent, you’ll see standard antialiasing.



Why does the transparency have to change for each subpixel? Isn’t it enough to stick with a single alpha value for the whole pixel and just change the RGB values?

Mark Thomas

This is all very convincing, but in Pages there’s a base layer of text where you do most of your typing — the word processing layer. Why can’t this layer use sub-pixel anti-aliasing? Yes, spoken like a clueless user, but hey. Doesn’t hurt to ask.

Michel Fortin

If you write gray text on a white background, for instance, subpixel antialiasing will make edges orange on the left and blue on the right. The same text on a black background will have the colors reversed: blue on the left and orange on the right. Choosing the right RGB values depends on the background color, and you don’t know that color in advance when drawing on a transparent background. You can either assume a predefined background color and get ugly results wherever the background doesn’t match, or revert to standard antialiasing.
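The reversal can be shown with a few lines of Python (illustrative coverage values, and black or white text instead of gray to keep the arithmetic simple):

```python
def draw_pixel(text_rgb, coverage_rgb, bg_rgb):
    # Blend the text color over the background, per subpixel.
    return tuple(t * c + b * (1 - c)
                 for t, c, b in zip(text_rgb, coverage_rgb, bg_rgb))

# Left-edge coverages for the red, green and blue subpixels.
edge = (0.2, 0.5, 0.8)

# Black text on white: the red channel dominates, a warm (orange) fringe.
on_white = draw_pixel((0, 0, 0), edge, (1, 1, 1))
# White text on black: the blue channel dominates, the fringe is reversed.
on_black = draw_pixel((1, 1, 1), edge, (0, 0, 0))
print(on_white, on_black)
```

Since the fringe colors depend on `bg_rgb`, there is no single correct set of RGB values to bake into a layer whose background is unknown.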

Mark, that is actually a pretty good question. I think in Pages you can put pictures underneath the text, which would justify the use of a transparent layer for the main text. That way, when you move or change the background image, Pages doesn’t have to redraw the text on the page, resulting in smoother direct manipulation of the images.

Pierre Igot

Based on this comment on my post:

subpixel anti-aliasing works just fine with text transparency in Safari. This page:

illustrates it with varying degrees of opacity over a solid colour background. There’s definitely subpixel anti-aliasing there.

This seems to contradict what you are saying, Michel, unless we are misunderstanding something. (I am still confused by the issue of transparency—or opacity—of the text itself vs. transparency of the background.)

Michel Fortin

Transparent text, or more precisely semitransparent text, poses absolutely no problem for subpixel antialiasing. Transparent backgrounds do.

For performance reasons, Pages and some other applications render each layer separately and then combine them together. This middle step, rendering layers off-screen in aRGB with a transparent background, makes it impossible to support subpixel antialiasing. But it makes dragging elements on the page smoother, which seems to be a major feature of Pages (at least it was in last year’s Steve Jobs keynote).

There is no problem with antialiasing in Safari because it draws the background first and then the text on top of it. There is not much to gain from rendering web pages in layers.

Just to make my position clear: I did not say Pages cannot support subpixel antialiasing. But to support it they will most probably have to sacrifice live-dragging performance somewhere.

Pierre Igot

Thanks for the clarification, Michel. It’s still a bit obscure to me, but I think I got the gist of it. It’s about being able to render transparent stuff without knowing what’s behind it beforehand, right?

In any case, I guess the core issue remains whether the live dragging thing is worth sacrificing text readability on LCD displays.

Michel Fortin

It’s about being able to render transparent stuff without knowing what’s behind it beforehand, right?


Lieven Baeten

Now there is a new piece to this puzzle: I am sitting behind a MacBook Pro and a 15” PowerBook. Some apps, e.g. Camino and Word, that use subpixel rendering on the PB do not do so on the MBP. Others, e.g. TextEdit, TextMate and Safari, use subpixel rendering on both machines. I remembered your informative blog entry and wonder if you can see an explanation for this phenomenon?

Michel Fortin

From what you say, Lieven, it seems that what makes the difference on your MacBook Pro is the API used to draw. Word probably uses old QuickDraw routines; maybe Gecko (Camino’s web page rendering engine) does the same, as it has common roots with the Mac OS 9 Mozilla suite. TextEdit, TextMate and Safari, on the other hand, use either Cocoa or CoreGraphics.

That said, I have no explanation for why this is happening. It could be a bug somewhere, it could be deliberate on Apple’s part to push developers to update their code, or it could be that some Intel-specific QuickDraw code draws all text in an off-screen buffer for some strange reason. I can only speculate.


I have a question that may be a bit basic for this forum, but…

To my eyes, the text on a Mac always looks blurry. I work in Word and PowerPoint a lot, and if I try to do it on a Mac, the fuzziness makes it very very difficult. In Windows, however, the text is simply sharper and easier to look at.

I know I’m not the only one with this issue, as I’ve seen a lot of comments on the web about the antialiasing on a Mac, and how it even causes headaches for some people. Others tell me that they have bad eyesight anyway, so it’s not a problem for them. I still have good eyesight, and the fuzzy text does bother me.

When I asked one of the “geniuses” at an Apple store about the fuzziness of the text, he just plain dismissed my observation, and started spouting off on how the MacBook screen is the highest resolution screen there is. Now, having had about 10 more years of technical and scientific education than that fellow, and teaching Engineering at a university, I was a little bit offended, but didn’t press the issue.

Could somebody on this forum enlighten me on the issue? Is there a good solution for it?

Thanks in advance!

Michel Fortin

MS, I think this article gives a pretty good explanation of the “fuzziness” problem. It compares antialiasing in Mac OS X to that of Mac OS 9, which in my opinion is quite similar to the antialiasing found in Windows.

Subpixel antialiasing can help here by virtually increasing the horizontal resolution, but the true solution to this problem is a higher-resolution screen with a user interface scaled appropriately. That will come eventually.



Thanks for the link. I have read the article at some point during my “web research” on the topic.

I wish there was some more discussion or information as to what drives these “enhancements,” and what rationale Apple and Microsoft follow in developing them. Don’t remember the link, but I did come across a discussion of various patent issues regarding the text mapping / displaying, from which I got the sense that the issue is as much about who owns what IP as it is about usability.

From what I have read about the human visual system and display technology, the antialiasing approach / effort is somewhat misplaced, because of how the human eye and brain process the information. There is a good deal of “averaging” going on as it is, and attempts to “help” the eye seem to confuse the brain more than help.