setting X server DPI

Tom Horsley tom.horsley at att.net
Tue Mar 10 19:22:26 UTC 2009


On Tue, 10 Mar 2009 13:35:51 -0430
Patrick O'Callaghan wrote:

> This true but it shouldn't be. It's true because the sizes of things in
> X are defined in terms of pixels, and it's wrong because 12pt type is
> 12pt, no matter what medium it's on. It's an absolute size, not a given
> number of pixels.

No they aren't. All the font rendering libraries these days take
point sizes as the primary means of specifying font size, but the
point (he-he :-) is that the physical device eventually renders the
fonts by turning pixels on or off. If you only have 40 pixels per
inch, then a 12 point font has to be rendered on that device in
(12/72)*40 = 6.67 pixels (only 7 even if you round up). Considering
that the lower case characters are only about half that height, you
have a grand total of 3 or 4 pixels available to render the entire
set of glyphs in the font. No can read :-). Even with anti-aliasing
and greyscale values for the pixels, readable characters cannot be
rendered at that size.
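
That arithmetic is easy to check. Here is a little sketch of it
(Python, purely illustrative; the points_to_pixels helper is mine
and the 40, 75 and 96 DPI figures are just example values):

    import math

    def points_to_pixels(point_size, dpi):
        # One point is 1/72 of an inch, so the pixel height of a
        # requested font size is (points / 72) * dots-per-inch.
        return point_size / 72.0 * dpi

    for dpi in (40, 75, 96):
        px = points_to_pixels(12, dpi)
        print("12pt at %3d DPI -> %.2f pixels (%d rounded up)"
              % (dpi, px, math.ceil(px)))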

Often this is what you want. If you are preparing a print preview,
unreadable little smudges that give you an idea of where the lines
go are perfectly OK, but if you are trying to read menus and dialog
boxes and get work done, it is hopeless.

You could rewrite every application in the universe to carefully
deduce some "readability" factor based on the pixels available for
rendering the requested font size, and have it request a different
size font for the things it intends to be readable, or you can lie
about the DPI and not rewrite every single app in the universe.
Guess which one is more practical :-).
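
To see why the lie works, run the same conversion with an honest
DPI and an inflated one: every app that asks for 12pt quietly gets
more pixels to draw with, and nothing has to be rewritten. Again
just a sketch; the 40 and 96 values are made-up examples:

    def points_to_pixels(point_size, dpi):
        # pixels = (points / 72) * DPI reported by the X server
        return point_size / 72.0 * dpi

    real_dpi = 40   # what the hardware actually provides (example)
    fake_dpi = 96   # what we tell X to report instead (example)

    print("12pt with honest DPI: %.0f pixels"
          % points_to_pixels(12, real_dpi))
    print("12pt with the DPI lie: %.0f pixels"
          % points_to_pixels(12, fake_dpi))

In practice the lie usually gets told with something like the X
server's -dpi option or an Xft.dpi X resource, but the effect on
the arithmetic is the same either way.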

Just like you can calculate orbits with an earth-centric model and
epicycles within epicycles, or you can use a sun-centric model and
a simple ellipse. They both get the same result, but one is a heck
of a lot simpler than the other.



