Speaking of fonts...

Matthew Saltzman mjs at ces.clemson.edu
Mon Jul 10 14:26:54 UTC 2006


On Mon, 10 Jul 2006, Tim wrote:

> On Sun, 2006-07-09 at 20:03 -0400, Tom Horsley wrote:
>> If I correctly set the DPI for my Westinghouse LVM-42w2
>> 42" 1920x1080 monitor, the setting is 52dpi, and at
>> 52 dpi everyone seems to draw fonts that are somewhere
>> around 4 or 5 pixels high and mostly unreadable even
>> if you paste your eyeball to the screen :-).
>
> Yep, you're stuffed, because the people providing those options haven't
> understood the situation properly (those who design font rendering
> engines, and those who get you to set font sizes in pixels in web
> browser configurations).  I get the same thing (stupid font sizes).
>
> DPI means how many dots per inch, or pixels in that inch, will be used
> to draw the character (how smooth the edges are).  It has *NOTHING* to
> do with how big the character should be drawn.

There's no magic about "points" either.  There are 72 of them in an inch 
and that's all one needs to know about them.

>
> Geez, that problem was dealt with properly on printers ten years ago.
> Changing your printer from 24 DPI to 48 DPI, for example, didn't change
> the font sizes, unless the driver was written by a complete and utter
> moron.  You just got smoother looking fonts, at the same size.

Right.  The printer just doubled the number of dots in an inch--it slowed 
dot-matrix printers to half speed and used twice as much ink.

But the printer or its device-specific driver knows how wide and tall 
a sheet of paper is, and the DPI in any given mode is fixed.  So it's 
easy for the printer or driver to know how many dots make up a point.
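
In other words, dots per point is just DPI divided by 72.  A quick 
back-of-the-envelope sketch in Python (the DPI values are only 
illustrative):

    # A point is 1/72 inch, so dots per point is just DPI / 72.
    # Doubling the DPI doubles the dots available to draw each point
    # without changing the printed size.
    for dpi in (24, 48, 300, 600):
        print("%4d DPI -> %5.2f dots per point" % (dpi, dpi / 72.0))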

Displays can't change the number of dots in an inch.  Display drivers 
could prespecify the DPI, and I don't really know why they don't.  But 
to get accurately sized characters, the display driver needs to know 
the true DPI (and that's really all it needs).  The same screen 
resolution (in pixels) yields a different DPI depending on the 
physical size of the screen: 1280x1024 on a 17-inch CRT necessarily 
means more DPI than 1280x1024 on a 19-inch CRT, which in turn means 
more than 1280x1024 on a 19-inch LCD.
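
To put rough numbers on that, here's a sketch in Python (assuming 
square pixels and taking the advertised diagonal as the viewable 
diagonal, which flatters CRTs a bit):

    import math

    def true_dpi(h_pixels, v_pixels, diagonal_inches):
        # Horizontal DPI from the resolution and the screen's
        # viewable diagonal, assuming square pixels.
        diag_pixels = math.hypot(h_pixels, v_pixels)
        width_inches = diagonal_inches * h_pixels / diag_pixels
        return h_pixels / width_inches

    print(true_dpi(1280, 1024, 17))  # ~96 DPI
    print(true_dpi(1280, 1024, 19))  # ~86 DPI
    print(true_dpi(1920, 1080, 42))  # ~52 DPI -- Tom's Westinghouse

That last figure is just about the 52 DPI Tom computed for his 42-inch 
1920x1080 panel.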

The font-scaling math converts, say, 8-point type (1/9-inch high) to the 
best representation it can get given that there are 96 (or 52, or 84, or 
114) actual pixels in an inch.
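
That is, pixels = points * DPI / 72.  A small sketch of the same 
arithmetic (same illustrative DPI values as above):

    def points_to_pixels(points, dpi):
        # 72 points per inch, so scale the point size by the number
        # of pixels available in each inch.
        return points * dpi / 72.0

    for dpi in (52, 84, 96, 114):
        print("8 pt at %3d DPI -> %4.1f px"
              % (dpi, points_to_pixels(8, dpi)))

At 52 DPI, 8-point type comes out around 5.8 pixels tall, which is 
just about the unreadable 4 or 5 pixels Tom describes.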

>
> In this day and age of using scaled/vector fonts for just about
> everything, I don't know why this old treat-it-like-a-bitmap
> stupidity still exists.

Ultimately, modern low-cost displays and printers *are* just bitmaps. 
Even when the electron guns drew vectors, as they did in the early 
days of high-end graphics, the image was still drawn on phosphors on 
the screen (pixels).  Now all the vector math is devoted to converting 
the vector description to the best possible bitmapped rendering.

LCDs have the additional ability to use their colored subpixels to get 
even smoother rendering.  But to do that conversion accurately, one 
needs to know how to map points to pixels, i.e., the true DPI.

-- 
 		Matthew Saltzman

Clemson University Math Sciences
mjs AT clemson DOT edu
http://www.math.clemson.edu/~mjs



