Font Rendering: Respecting The Pixel Grid

I've finally determined What's Wrong With Apple's Font Rendering. As it turns out, there actually wasn't anything wrong with Apple's font rendering, per se. Apple simply chose a different font rendering philosophy, as Joel Spolsky explains:


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2007/06/font-rendering-respecting-the-pixel-grid.html

Microsoft will feel significantly stupid when the average person has a monitor with five times the DPI that’s around now. Pixel… grid… riiiight… no one will care in a couple of years.

For now, I find the OS X fonts a lot more pleasing to the eye. If they’re blurry, I just make them bigger; it’s better for your eyes anyway.

Ever print a Word Doc that you’ve perfectly squeezed into 2 pages? Ever have it print on more than 2 pages? Is this still a problem in Vista and Office 10?

I haven’t used Office in years, but generally I go from Doc to PDF to printer to work around this problem.

Is this a problem in Mac OS X?

So there are two issues:

  1. Layout: Picking the pixel grid that will be rendered to (kerning, leading, tracking)
  2. Rasterization: Picking pixels to fill (grid-fitting, subpixel rendering, anti-aliasing, etc)

If you are more conservative with text layout, you usually need to be more aggressive about filling pixels during rasterization to make your fonts look good.
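To make the rasterization tradeoff concrete, here is a minimal FreeType sketch (the font file, size, and DPI are arbitrary assumptions, and error handling is omitted): one load honors the designer’s outlines with no grid-fitting, roughly the Apple philosophy, while the other snaps hard to the pixel grid, roughly the classic Windows one.

```c
#include <ft2build.h>
#include FT_FREETYPE_H

int main(void)
{
    FT_Library lib;
    FT_Face    face;

    FT_Init_FreeType(&lib);
    FT_New_Face(lib, "DejaVuSans.ttf", 0, &face);  /* assumed font file */
    FT_Set_Char_Size(face, 0, 12 * 64, 96, 96);    /* 12 pt at 96 dpi */

    /* Faithful to the typeface: no grid-fitting, smooth anti-aliasing. */
    FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_NO_HINTING);

    /* Faithful to the grid: full hinting, hard black-and-white pixels. */
    FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_TARGET_MONO);

    FT_Done_Face(face);
    FT_Done_FreeType(lib);
    return 0;
}
```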

@Michael Graham Richard: FreeType is generally used in the Linux world. To my eye, it looks closer to the Apple way of rendering.

I’ve worked on projects that used FreeType for video games in the PC world. However, we rasterized glyphs out to a texture that was then hardware-rendered.
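For what it’s worth, the glyph-to-texture step can be as simple as copying each rendered bitmap into a CPU-side atlas before uploading it; a rough sketch (the atlas layout and helper name are my own assumptions):

```c
#include <ft2build.h>
#include FT_FREETYPE_H

/* Copy an 8-bit FreeType coverage bitmap into a single-channel
 * CPU-side atlas that is later uploaded as a texture. 'slot' is
 * assumed to come from FT_Load_Char(..., FT_LOAD_RENDER). */
static void blit_glyph(const FT_GlyphSlot slot,
                       unsigned char *atlas, int atlas_w,
                       int pen_x, int pen_y)  /* top-left placement */
{
    const FT_Bitmap *bmp = &slot->bitmap;
    for (unsigned int row = 0; row < bmp->rows; ++row)
        for (unsigned int col = 0; col < bmp->width; ++col)
            /* assumes a positive pitch and FT_PIXEL_MODE_GRAY data */
            atlas[(pen_y + row) * atlas_w + (pen_x + col)] =
                bmp->buffer[row * (unsigned int)bmp->pitch + col];
}
```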

Hardware rendering has its own quirks. It’s called a pixel shader in the DirectX world and a fragment shader in the OpenGL world. Fragment shader is the more correct term because multisample anti-aliasing (MSAA) happens after the fragment shader.

MSAA is user-controlled at the driver level, so all of the text you’ve painstakingly anti-aliased can get blurred when the hardware attempts to anti-alias it again.
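If the blur comes from application-level MSAA rather than a driver override, one mitigation, sketched here for desktop OpenGL (a driver that forces MSAA can still ignore it), is to disable multisampling only while the pre-anti-aliased text quads are drawn:

```c
#include <GL/gl.h>

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D  /* GL 1.3 token, in case of old headers */
#endif

/* Draw text without letting MSAA resample the glyph quads.
 * 'draw_glyph_quads' is a hypothetical text-drawing callback. */
void draw_text_without_msaa(void (*draw_glyph_quads)(void))
{
    glDisable(GL_MULTISAMPLE);  /* glyphs carry their own coverage AA */
    draw_glyph_quads();
    glEnable(GL_MULTISAMPLE);   /* restore for ordinary geometry */
}
```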

The One Laptop Per Child laptop has a 200 DPI display (1200×900 on a 6″×4.5″ screen).

I think I prefer the Apple version; the fonts in Windows just seem too thin, really, and bold text looks far too extreme. But overall I’d never really noticed it, even when switching between my OS X machine and Windows. Maybe it works better on Apple’s displays somehow? (Although I can’t possibly think how.)

Interesting fact, though: Ubuntu gives you four different font rendering methods to pick from, with examples of the effect each has, which is interesting to look at in light of all of this debate. You can see it at tim-perry.co.uk/fonts.png.

Man, posting anything about the Mac really brings the morons out of the woodwork.

See, this is why I don’t have comments on my blog, Jeff. :wink:

Sounds like the old form vs. function debate to me. I think it is just a philosophical difference between MS and Apple. And then there is the *nix crowd, which has neither.

Camz: “when you look a bit deeper into OS X you discover that their display is managed with display postscript.”

Not true. That was true for NeXT; OS X is not using it in any way. The display model of OS X is similar to some (early) version of PDF, but Quartz is not “Display PDF,” even if some have used that term.

Fred: “Apple has had WebKit running on Windows for years, it’s what renders the iTunes Music Store inside iTunes.”

Not true, at least not as of a few years ago, according to a leading WebKit developer: http://weblogs.mozillazine.org/hyatt/archives/2004_06.html#005666

That could change with time, though. If it had, I would guess the Apple blogosphere would have noticed it in a heartbeat :slight_smile:

Then, about the fonts: they are highly subjective. Color perception, the panel, anti-aliasing settings, personal preferences, personal history with anti-aliased text, etc., all have a big impact on the outcome.

In some ways I find the ClearType way better, and in some ways the Apple way… frankly, I don’t like either very much. OS X rendering is much too fuzzy; ClearType is too light, and its quality seems to depend more on the panel and other factors than OS X’s does. Maybe one of them will someday be tuned more to my liking, but that would mean finding some kind of middle ground.

BTW, I’m a Mac user too, sorry for posting a moronic comment.

Parveen: “Ever print a Word Doc that you’ve perfectly squeezed into 2 pages? Ever have it print on more than 2 pages? Is this still a problem in Vista and Office 10?”

It’s not really an OS-related issue, and I haven’t used Office 10 at home since Office 2003 came out (and I don’t print at work unless someone’s thick-headed; I don’t even use Word for much that’s work-related other than my resume, because they want it in a .doc file with a specific format).

Office 2007 (12) hasn’t caused this problem for me, but I have a tendency to preview my documents to make sure I’ve set up all of the margins and so forth before printing, especially if I’m planning on turning them over to someone else. I tend to think that Word has some issues when used for text layout, but it seems to have gotten better in the last decade. Word also opens in the “Print Layout” view (at least for me) rather than “Draft,” so it tends to be more accurate with page breaks and margins than it used to be (simply because people expected it to be in the layout mode when it wasn’t, and didn’t always know there even was a separate mode).

One thing I wanted to point out for those who primarily use Macs and only occasionally use Windows: are you sure ClearType was turned on? It’s on by default in Vista, but in Windows XP it had to be explicitly enabled. It’s possible your bad experiences with font rendering on Windows have to do with un-anti-aliased Windows XP text.
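If you’d rather verify than eyeball it, Win32 exposes the setting through SystemParametersInfo; a small check along these lines (my own sketch, for XP or later) should tell you:

```c
#define _WIN32_WINNT 0x0501  /* needed for the ClearType constants */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    BOOL smoothing = FALSE;
    UINT type = 0;

    /* Font smoothing can be on while ClearType is off, so test both. */
    SystemParametersInfo(SPI_GETFONTSMOOTHING, 0, &smoothing, 0);
    SystemParametersInfo(SPI_GETFONTSMOOTHINGTYPE, 0, &type, 0);

    printf("smoothing: %s, ClearType: %s\n",
           smoothing ? "on" : "off",
           type == FE_FONTSMOOTHINGCLEARTYPE ? "on" : "off");
    return 0;
}
```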

The trouble is that non-96-dpi displays still suck. The fonts may scale (depending on whether the developer implemented the call to CreateFont correctly!), but the pixel-fitting algorithm then means that the spacing changes subtly, and controls may not be large enough for the expanded text. Also, the taskbar icons are scaled using the StretchBlt API’s pixel-doubling mechanism rather than by picking a large icon and scaling it down.
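For reference, the DPI-aware CreateFont call that parenthetical alludes to derives the pixel height from the device’s actual DPI instead of assuming 96; a sketch, with an arbitrary face name and point size:

```c
#include <windows.h>

/* Create a font whose on-screen size tracks the device's real DPI. */
HFONT create_scaled_font(HDC hdc, int point_size)
{
    /* Negative height requests character height matching the point
     * size; LOGPIXELSY is the device's vertical DPI, 72 points/inch. */
    int height = -MulDiv(point_size, GetDeviceCaps(hdc, LOGPIXELSY), 72);
    return CreateFontA(height, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                       DEFAULT_CHARSET, OUT_DEFAULT_PRECIS,
                       CLIP_DEFAULT_PRECIS, CLEARTYPE_QUALITY,
                       DEFAULT_PITCH | FF_DONTCARE,
                       "Segoe UI");  /* face name is an example */
}
```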

Windows Vista has a new scheme for scaling the display, but that results in really fuzzy output because ClearType is applied before scaling, and you get the fuzzy-icons problem too. Also, it’s turned off by default (at least it was in RC2; I don’t believe I’ve checked this in the RTM yet), and XP’s system is used by default. I think the new scheme also requires hardware acceleration (Aero) to be turned on.

The new scheme probably works great on 300-400dpi displays. It sucks at 120dpi.

Dear Jeff,

Remember PIXEL FONTS? That’s right. PIXEL FONTS. Google them. Get Flash. Enjoy.

“What do you respect more: the pixel grid, or the font designer?”

You went with the Grid? Time to ditch that Matrix screensaver, Jeff.

I think the font rendering of Safari on Mac OS X is better than Safari on Windows.

I prefer the Mac way. Some days I think Windows still looks like the old GEOS for the CBM64 :wink:

A thought: if Safari has been ported to Windows, did they port Cocoa? That would be a great scoop, I think :slight_smile:

The iPhone, which runs OS X and the version of Safari that was released as a public beta on Monday, and which uses the same font rendering technology, has a 160 DPI display. The future arrives on June 29th.

http://www.apple.com/iphone/technology/specs.html

Monitors with large numbers of dots per inch simply have no market at the moment, as we lack an operating system and related software written in a resolution-independent manner.

Take icons as an example. Most icon work I’ve seen is done with pixel maps, which means the ideal display for icons has one pixel of icon image for each pixel on the screen. Sure, you can scale pixel images, but the result doesn’t look ideal.

If you were to take software that looks good on a 100 dpi monitor and run it on a 200 dpi monitor, the icons would appear at half the size they did on the 100 dpi monitor. This is most likely not the effect you were designing for. Your goal was most likely to create an inch-wide graphic that appears an inch wide on anything: different monitors, printouts, whatever.

The dream would be to encode your data in a resolution-independent manner, with something like vector graphics, so that regardless of the screen resolution your icon would take up the same amount of physical space on the screen.

We need an OS and software that are resolution-independent before we can see higher resolution monitors. Because most GUIs are designed in a resolution-dependent manner (every GUI I’ve worked on at some point specifies something in terms of pixels rather than units like inches), upping the resolution would suddenly make a lot of software look bad.
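As a toy illustration of working “in units of inches,” the conversion itself is trivial; the hard part is that whole GUI stacks never do it. The densities below are just the figures mentioned in this thread:

```c
#include <stdio.h>

/* Convert a physical dimension to device pixels for a given density. */
static int inches_to_pixels(double inches, double dpi)
{
    return (int)(inches * dpi + 0.5);  /* round to nearest pixel */
}

int main(void)
{
    /* 96 dpi desktop, 160 dpi iPhone, 200 dpi OLPC (from this thread). */
    const double dpis[] = { 96.0, 160.0, 200.0 };
    for (int i = 0; i < 3; ++i)
        printf("a 1-inch icon at %.0f dpi needs %d px\n",
               dpis[i], inches_to_pixels(1.0, dpis[i]));
    return 0;
}
```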

We’re not seeing high-resolution monitors for a simple reason: there would be a lot of negative effects if they became widely used.

What we’re seeing with Apple’s fonts is Apple trying to move to a resolution-independent world. It’s a hard step to take, as there is so much resolution-dependent legacy code out there. It’s hard to add something like resolution independence without breaking everything else on the planet.

“The iPhone, which runs OS X and the version of Safari that was released as a public beta on Monday, and which uses the same font rendering technology, has a 160 DPI display. The future arrives on June 29th.”

Y’all are speaking about a magic leap from 100 to 200 dpi! Please be aware that classic fonts were designed when there was no concept of “resolution”; typesets were actually lead castings. These days, professional print quality begins at 1200 dpi and goes up to 3600 dpi. A 600 dpi laser printer is barely enough for printing proofs, and fonts on a 300 dpi laser look like crap.

So, let’s face it: there is no point in preserving print-oriented font designs on displays. I mentioned the “greek text below…” setting before to emphasize that designing page layouts and reading text are two completely different tasks.

For monitors, we need a totally different design approach, which ClearType achieved just fine. In Print Preview, though, rendering as close to the printed shapes as possible really is a good idea.

If I am not mistaken, a 17″ screen would give you a resolution of 1280×1024 (which is what I use), while a 15″ screen would give you the resolution you spoke of, 1024×768.

It’s funny to see all those blurry fonts, and that people tolerate them. There’s still nothing as crisp as good old X11 bitmapped fonts, and those are what I prefer in my text editor or terminal window. In the browser, whatever FreeType gives me with those Bitstream Vera fonts on Ubuntu is fine.

Jeff, I had a different post on this same subject; your previous post was the trigger for mine. Please check it out:
http://sarathc.wordpress.com/2007/06/14/safari-is-better-in-complex-unicode-font-sub-pixel-rendering/
Anyway, I’m not an expert in fonts or their rendering technologies.