Hi,
I'm working on bringing OLPC up-to-date with all the great efforts with GNOME 3, systemd, etc.
On the OLPC XO laptops we have quite a strange screen - it is small (152mm x 114mm) but very high resolution (1200x900 i.e. 201 dots per inch).
Previously, on Fedora 14, we had to adjust the default GNOME font sizes since they didn't look right on the screen (I think they were too big). Now I'm looking at applying the same set of customisations to Fedora 16 since the default fonts are uncomfortably small on our display.
However, I've noticed a fundamental difference in the sizing of fonts between Fedora 14 and Fedora 16. This is visible with a simple experiment:
1. Open gedit
2. Change document font size to Sans 72
3. Write the capital letter "I" and measure the height of the printed character with a ruler
I do this on two laptops side by side, one running Fedora 14 and the other running Fedora 16. On F14 the height of the I character is 1.9cm, and on Fedora 16 it is 0.9cm. That is quite a difference.
On both laptops, xdpyinfo correctly prints the screen resolution, DPI and display size, which have not changed.
From a typographic standpoint, F14 seems to be correct here. As 1pt is
(approx) 1/72 of an inch, size 72 should produce characters of around 1 inch in size - and 1.9cm (the F14 measurement) is about an inch.
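For what it's worth, the arithmetic can be sketched out as below (rough numbers only; the ~0.73 cap-height ratio for DejaVu Sans is an eyeballed approximation, not a measured value):

    # Rough sketch: physical height of a 72 pt capital letter on the XO's
    # 201 dpi panel, depending on which DPI the toolkit uses for pt -> px.
    PHYSICAL_DPI = 201          # XO: 1200 px across 152 mm ~= 201 dpi
    CAP_HEIGHT = 0.73           # approx. cap-height/em ratio for DejaVu Sans

    def cap_height_cm(point_size, layout_dpi):
        px = point_size / 72.0 * layout_dpi    # pixels the font engine rasterises
        em_cm = px / PHYSICAL_DPI * 2.54       # physical size of the em on the panel
        return em_cm * CAP_HEIGHT

    print(cap_height_cm(72, 201))   # ~1.9 cm -- roughly the F14 measurement
    print(cap_height_cm(72, 96))    # ~0.9 cm -- roughly the F16 measurement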
Also, Cantarell seems to play by its own rules. On F16, the "I" in Cantarell 72 is 0.8cm, not too different from Sans 72, but the difference between Sans 11 and Cantarell 11 is more significant - at size 11, Cantarell is tiny.
Can anyone help me understand this behaviour?
Thanks, Daniel
On 2011/09/30 11:00 (GMT+0100) Daniel Drake composed:
I'm working on bringing OLPC up-to-date with all the great efforts with GNOME 3, systemd, etc.
On the OLPC XO laptops we have quite a strange screen - it is small (152mm x 114mm) but very high resolution (1200x900 i.e. 201 dots per inch).
Previously, on Fedora 14, we had to adjust the default GNOME font sizes since they didn't look right on the screen (I think they were too big). Now I'm looking at applying the same set of customisations to Fedora 16 since the default fonts are uncomfortably small on our display.
However, I've noticed a fundamental difference in the sizing of fonts between Fedora 14 and Fedora 16. This is visible with a simple experiment:
- Open gedit
- Change document font size to Sans 72
- Write the capital letter "I" and measure the height of the printed character with a ruler
I do this on two laptops side by side, one running Fedora 14 and the other running Fedora 16. On F14 the height of the I character is 1.9cm, and on Fedora 16 it is 0.9cm. That is quite a difference.
On both laptops, xdpyinfo correctly prints the screen resolution, DPI and display size, which have not changed.
From a typographic standpoint, F14 seems to be correct here. As 1pt is
(approx) 1/72 of an inch, size 72 should produce characters of around 1 inch in size - and 1.9cm (the F14 measurement) is about an inch.
Also, Cantarell seems to play by its own rules. On F16, the "I" in Cantarell 72 is 0.8cm, not too different from Sans 72, but the difference between Sans 11 and Cantarell 11 is more significant - at size 11, Cantarell is tiny.
Can anyone help me understand this behaviour?
Sounds to me like your F14 is using correct DPI while your F16 is forced to 96. Does your F14 have an /etc/X11/xorg.conf file or a non-empty /etc/X11/xorg.conf.d/?
Can you try opening Firefox 3.x with hidden (about:config) pref layout.css.dpi set to 0, and again set to 201, and loading http://fm.no-ip.com/Auth/dpi-screen-window.html to see what DPI it reports? Same in Konqueror? (other/newer browsers lock to 96).
On Fri, Sep 30, 2011 at 12:09 PM, Felix Miata mrmazda@earthlink.net wrote:
Sounds to me like your F14 is using correct DPI while your F16 is forced to 96. Does your F14 have an /etc/X11/xorg.conf file or a non-empty /etc/X11/xorg.conf.d/?
Bad DPI could certainly be a cause. However, xdpyinfo reports the correct value (201) on both platforms.
We use a config file in /etc/X11/xorg.conf.d which specifies DisplaySize - needed for the correct DPI value to be computed.
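For reference, a minimal sketch of what such a snippet can look like (the identifier and layout here are illustrative, not the actual OLPC file; DisplaySize takes the panel dimensions in millimetres, and depending on the driver the Monitor section may also need to be referenced from a Screen or Device section):

    Section "Monitor"
        Identifier "Monitor0"
        DisplaySize 152 114
    EndSection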
Can you try opening Firefox 3.x with hidden (about:config) pref layout.css.dpi set to 0, and again set to 201, and loading http://fm.no-ip.com/Auth/dpi-screen-window.html to see what DPI it reports? Same in Konqueror? (other/newer browsers lock to 96).
Sure, I'll try this. Do I run these tests on F14 or F16? (do you really mean Firefox 3.x?)
cheers Daniel
On 2011/09/30 13:14 (GMT+0100) Daniel Drake composed:
Felix Miata wrote:
Sounds to me like your F14 is using correct DPI while your F16 is forced to 96. Does your F14 have an /etc/X11/xorg.conf file or a non-empty /etc/X11/xorg.conf.d/?
Bad DPI could certainly be a cause. However, xdpyinfo reports the correct value (201) on both platforms.
There is actually a possibility for 3 different DPIs to be recognized by various apps on a single X desktop. xdpyinfo only reports one of the 3, which is why I asked to open that URL in Firefox.
We use a config file in /etc/X11/xorg.conf.d which specifies DisplaySize - needed for the correct DPI value to be computed.
Check if your F14 /etc/X11/xorg.conf.d/ has valid device and screen specified while your F16 does not. I think post-F14 Xorg behavior in this regard and/or default files and/or docs about them changed.
Can you try opening Firefox 3.x with hidden (about:config) pref layout.css.dpi set to 0, and again set to 201, and loading http://fm.no-ip.com/Auth/dpi-screen-window.html to see what DPI it reports? Same in Konqueror? (other/newer browsers lock to 96).
Sure, I'll try this. Do I run these tests on F14 or F16?
Both, if device and screen in xorg.conf.d/ aren't your F16 problem, otherwise neither. Might be easier to do in F14 unless you're familiar with using the mozilla.org static binaries. Oh, and don't use your regular profile(s). Start FF with -profilemanager and create a new one to use. I think you can damage your 4/5/6/7 profile by using it for an older version and then going back to 4/5/6/7. I'm not sure which of the post-3.x versions is responsible for the to/fro incompatibility. Or else backup first, run the tests, then restore.
(do you really mean Firefox 3.x?)
"other/newer browsers lock to 96", which means they will only report 96 (or maybe 192?, since you have an actual 201 DPI) unless you've altered layout.css.devPixelsPerPx, in which case desktop DPI & reported DPI won't likely correlate positively.
On Fri, 30 September 2011 12:00, Daniel Drake wrote:
Can anyone help me understand this behaviour?
<rant>
There's nothing to understand – this is a new major GNOME release, with developers who know better than everyone else and solve problems by ignoring past experience and hardcoding their own preferences.
Someone GNOME-side decided not to trust the X.org DPI and added a new heuristic to 'correct' it. (The last time this happened, it took several years of user complaints before it was reverted; I'm quite sure there will be a new round of excuses for why it is a good idea to second-guess X.org hardware detection instead of fixing any actual X.org bugs. What it boils down to is that some people on the GNOME side have less work to do to configure their own hardware – around which the new heuristic has been built – while everyone else gets weird, unwelcome side effects, and apps using other toolkits won't agree on what font sizes mean.)
Another someone decided DejaVu (what you call Sans) was too old and tired, and preempted it with a new, unfinished font. It seems people do not understand that UI fonts are there to display text, and that a font people do not notice at all is a good UI font. Mind you, Cantarell is a nice free and open font, but did it really need shoving down people's throats to be advertised? Especially considering its coverage is too small to support a lot of languages, and its metrics are too different from the available fallback fonts for the fallbacks to be graceful.
There's nothing to do apart from waiting for enough complaints to pile up that the people in charge get past their reality-denial phase. </rant>
On Fri, Sep 30, 2011 at 2:47 PM, Nicolas Mailhot nicolas.mailhot@laposte.net wrote:
Someone Gnome-side decided to not trust xorg dpi and added a new heuristic to 'correct' it
I know this is part of a rant.. but any chance you could quantify that point with a link to a commit, blog post, mailing list discussion, something like that?
cheers Daniel
On Fri, Sep 30, 2011 at 02:59:58PM +0100, Daniel Drake wrote:
On Fri, Sep 30, 2011 at 2:47 PM, Nicolas Mailhot nicolas.mailhot@laposte.net wrote:
Someone Gnome-side decided to not trust xorg dpi and added a new heuristic to 'correct' it
I know this is part of a rant.. but any chance you could quantify that point with a link to a commit, blog post, mailing list discussion, something like that?
Does http://people.gnome.org/~federico/news-2007-01.html count? Or http://svn.gnome.org/viewvc/gnome-settings-daemon/branches/gnome-2-24/plugin... (line 249)?
On Fri, Sep 30, 2011 at 8:47 AM, Tomasz Torcz tomek@pipebreaker.pl wrote:
Or http://svn.gnome.org/viewvc/gnome-settings-daemon/branches/gnome-2-24/plugin... (line 249)?
I'm not sure that's relevant for the current codebase. But even so, if you look at lines 73-75, the high and low reasonable limits don't seem unreasonable to me. And since the OLPC screen hardware being described is between the high/low reasonable limits in that particular bit of code you are pointing to, the fallback logic wouldn't fire, so it couldn't be the cause of the particular technical issue which started this thread.
-jef
On Fri, Sep 30, 2011 at 6:05 PM, Jef Spaleta jspaleta@gmail.com wrote:
On Fri, Sep 30, 2011 at 8:47 AM, Tomasz Torcz tomek@pipebreaker.pl wrote:
Or http://svn.gnome.org/viewvc/gnome-settings-daemon/branches/gnome-2-24/plugin... (line 249)?
I'm not sure that's relevant for the current codebase. But even so, if you look at lines 73-75, the high and low reasonable limits don't seem unreasonable to me. And since the OLPC screen hardware being described is between the high/low reasonable limits in that particular bit of code you are pointing to, the fallback logic wouldn't fire, so it couldn't be the cause of the particular technical issue which started this thread.
Jef, you're right, but Tomasz's link did send me in the right direction. Thanks!
Here is the equivalent code of today: http://git.gnome.org/browse/gnome-settings-daemon/tree/plugins/xsettings/gsd...
Discussion: https://bugzilla.gnome.org/show_bug.cgi?id=643704
Summary: GNOME hardcodes DPI to 96 regardless of X configuration.
However, it now has a text scaling factor in gsettings that I was not aware of. So I guess I just need to find an appropriate factor that makes fonts look OK on the XO.
cheers Daniel
On Fri, Sep 30, 2011 at 11:30 PM, Daniel Drake dsd@laptop.org wrote:
However, it now has a text scaling factor in gsettings that I was not aware of. So I guess I just need to find an appropriate factor that makes fonts look OK on the XO.
One problem faced here is that (as noted earlier) Cantarell behaves quite differently to Sans (DejaVu) in terms of scaling at different sizes, and the default setup mixes these 2 fonts. This problem gets amplified when applying large scale factors.
So I settled for a scale factor of 2.1. This makes the document font and window title font (both Sans) look the correct size, but everything else (Cantarell) is uncomfortably large. Then I changed the default font from Cantarell to Sans and now everything looks fine.
So my final override:
[org.gnome.desktop.interface]
cursor-size=48
text-scaling-factor=2.1
font-name='Sans 7'
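(For reference, and assuming the standard GSettings mechanisms rather than anything OLPC-specific: the same key can be tried at runtime with e.g. gsettings set org.gnome.desktop.interface text-scaling-factor 2.1, and an override like the one above is typically shipped as a file such as /usr/share/glib-2.0/schemas/99-olpc-fonts.gschema.override and activated by running glib-compile-schemas on that schemas directory; the file name is illustrative.)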
Thanks !
Daniel
<rant> On Fri, Sep 30, 2011 at 8:53 PM, Kevin Kofler kevin.kofler@chello.at wrote:
Daniel Drake wrote:
Summary: GNOME hardcodes DPI to 96 regardless of X configuration.
This is very broken.
Gnome: Reliving Windows' horrible past, one emulated bug at a time.
At least we can be thankful that, unlike Windows, GNOME doesn't have the market force required for its flaws to retard the availability of displays with reasonable pixel densities.
</rant>
On Fri, 2011-09-30 at 15:47 +0200, Nicolas Mailhot wrote:
There's nothing to do apart from waiting for enough complaints to pile up that the people in charge get past their reality-denial phase.
</rant>
This thread could have led to something constructive... but not so much anymore now, I guess. Good going, getting down to the rant level in less than 10 mails :-(
On Fri, Sep 30, 2011 at 3:35 PM, Matthias Clasen mclasen@redhat.com wrote:
On Fri, 2011-09-30 at 15:47 +0200, Nicolas Mailhot wrote:
There's nothing to do apart from waiting for enough complaints to pile up that the people in charge get past their reality-denial phase.
</rant>
This thread could have led to something constructive... but not so much anymore now, I guess. Good going, getting down to the rant level in less than 10 mails :-(
There's no reason it can't be recovered. Matthias, can you provide an explanation of the changes and the rationale behind them?
Peter
On 2011/09/30 11:09 (GMT-0400) Peter Robinson composed:
can you provide an explanation of the changes and the rationale behind them?
I think it's become clear over the past couple of years that the Gnome and KDE devs have decided they're controlling a playground rather than software for users to be productive with, expecting those who don't like their silly power-sapping toys to go to XFCE, LXDE or elsewhere to absolve themselves of the effects of naive and anarchist devs.
On Fri, Sep 30, 2011 at 12:12:45PM -0400, Felix Miata wrote:
I think it's become clear over the past couple of years that the Gnome and KDE devs have decided they're controlling a playground rather than software for users to be productive with, expecting those who don't like their silly power-sapping toys to go to XFCE, LXDE or elsewhere to absolve themselves of the effects of naive and anarchist devs.
+1
Well said...
/me being a pretty happy Xfce user since F15.
On 30/09/11 17:12, Felix Miata wrote:
On 2011/09/30 11:09 (GMT-0400) Peter Robinson composed:
can you provide an explanation of the changes and the rationale behind them?
I think it's become clear over the past couple of years that the Gnome and KDE devs have decided they're controlling a playground rather than software for users to be productive with, expecting those who don't like their silly power-sapping toys to go to XFCE, LXDE or elsewhere to absolve themselves of the effects of naive and anarchist devs.
I haven't used Gnome since F10 or so. But what I surmise is: if you don't progress, you stagnate.
Rawhide, with hindsight, would probably have been the better candidate for any major advance, where more of the bugs/gripes could have been ironed out. Then place it in FN+1 Branched for greater feedback.
Maybe it has been done that way, I honestly don't know. But introduce it in Rawhide, instead of Branched.
Felix Miata wrote:
I think it's become clear over the past couple of years that the Gnome and KDE devs have decided they're controlling a playground rather than software for users to be productive with
What does KDE have to do with this? KDE honors your screen's physical DPI by default. (It has done so for years.)
Kevin Kofler
On Fri, 2011-09-30 at 16:09 +0100, Peter Robinson wrote:
On Fri, Sep 30, 2011 at 3:35 PM, Matthias Clasen mclasen@redhat.com wrote:
On Fri, 2011-09-30 at 15:47 +0200, Nicolas Mailhot wrote:
There's nothing to do apart from waiting for enough complaints to pile up that the people in charge get past their reality-denial phase.
</rant>
This thread could have led to something constructive... but not so much anymore now, I guess. Good going, getting down to the rant level in less than 10 mails :-(
There's no reason it can't be recovered. Matthias, can you provide an explanation of the changes and the rationale behind them?
They're documented in the code, which has a link to the bug. We basically can't trust the DPI X.org gives us. Which is something two people who have been working on GTK+ for more than 10 years mention on the bug, and I've double- and triple-checked with X hackers.
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
On 10/3/11 11:43 AM, Jan Kratochvil wrote:
On Mon, 03 Oct 2011 17:34:45 +0200, Bastien Nocera wrote:
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
If not blacklisting then whitelisting them, you have the community. This is X.org's task, though.
Speaking as Xorg: No, we don't. You have _no_ idea how many displays there are. There's nothing like effective coverage to be had here.
More to the point, your DPI numbers would be per-output anyway, so there's no picking a single point size preference, the same size in pixels would be different sizes in millimeters on each output.
You can't win at this. Don't try.
- ajax
On Mon, 3 October 2011 18:56, Adam Jackson wrote:
More to the point, your DPI numbers would be per-output anyway, so there's no picking a single point size preference, the same size in pixels would be different sizes in millimeters on each output.
So what? Yes, DPI needs to be per-output.
I've seen extended desktops (fixed screens + laptop screens) where forcing the same pixel adjustment on both screens resulted in a very unpleasant lens-like effect when moving a window from one screen to another. On one screen the text was too big, on the other way too small.
And this is going to get worse with fixed-size laptop screens that try to squeeze in HD video resolutions for marketing reasons, and desktop screens that try to pretend they are TVs and extend way past their own pixel resolution.
Hardcoding 96 dpi does not fix anything; it just exchanges the (solvable) hardware detection problem (which is not the DE's problem anyway - at most the DE should provide an 'I have broken hardware, override detection' switch) for (unsolvable) font size coherence problems. Right now *no* non-GNOME 3 app will agree with GNOME font sizes, even on single-screen systems with good EDID. This is not user-friendly at all.
On Mon, 2011-10-03 at 12:56 -0400, Adam Jackson wrote:
More to the point, your DPI numbers would be per-output anyway, so there's no picking a single point size preference, the same size in pixels would be different sizes in millimeters on each output.
In fairness, for my dual head setup I did what I always do: I bought a matched pair of monitors from the same batch. EDID seems sane, too.
Jon.
On 10/03/2011 10:48 AM, Camilo Mesias wrote:
Hi,
A daft question perhaps, but I thought...
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
How do other OSes do it?
I don't know that they do. In my use of Windows up through XP, I never saw evidence of it re-scaling fonts based on DPI, at least for general UI use (word processors may do so).
- Michael
On 10/03/2011 06:01 PM, Michael Ekstrand wrote:
On 10/03/2011 10:48 AM, Camilo Mesias wrote:
Hi,
A daft question perhaps, but I thought...
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
How do other OSes do it?
I don't know that they do. In my use of Windows up through XP, I never saw evidence of it re-scaling fonts based on DPI, at least for general UI use (word processors may do so).
The XP installation I occasionally cannot avoid using has, in its system control menus, controls to switch between "normal", "big" and "very big" fonts, and under "expert/advanced controls" one can specify font sizes in points for many details of the DE.
Ralf
Hi,
On Mon, Oct 3, 2011 at 5:22 PM, Ralf Corsepius rc040203@freenet.de wrote:
The XP installation I occasionally cannot avoid using has, in its system control menus, controls to switch between "normal", "big" and "very big" fonts, and under "expert/advanced controls" one can specify font sizes in points for many details of the DE.
Wouldn't the sane solution be to honour the fallible DPI detection, with an expert tweak available to rescue those who have unusual hardware (or preferences)? I can't see the justification for the present override.
-Cam
On Tue, Oct 4, 2011 at 04:01, Camilo Mesias camilo@mesias.co.uk wrote:
On Mon, Oct 3, 2011 at 5:22 PM, Ralf Corsepius rc040203@freenet.de wrote:
The XP installation I occasionally cannot avoid using has, in its system control menus, controls to switch between "normal", "big" and "very big" fonts, and under "expert/advanced controls" one can specify font sizes in points for many details of the DE.
Wouldn't the sane solution be to honour the fallible DPI detection, with an expert tweak available to rescue those who have unusual hardware (or preferences)? I can't see the justification for the present override.
No. It wouldn't.
On 10/04/2011 09:36 AM, Jason D. Clinton wrote:
On Tue, Oct 4, 2011 at 04:01, Camilo Mesias camilo@mesias.co.uk wrote:
On Mon, Oct 3, 2011 at 5:22 PM, Ralf Corsepius rc040203@freenet.de wrote:
The XP installation I occasionally cannot avoid using has, in its system control menus, controls to switch between "normal", "big" and "very big" fonts, and under "expert/advanced controls" one can specify font sizes in points for many details of the DE.
Wouldn't the sane solution be to honour the fallible DPI detection, with an expert tweak available to rescue those who have unusual hardware (or preferences)? I can't see the justification for the present override.
No. It wouldn't.
Not exactly the most compelling of arguments. ;-)
Grovelling around in the F15 xorg-server sources and reviewing the Xorg log file on my F15 box, I see, with _modern hardware_ at least, that we do have the monitor geometry available from DDC or EDID, and obviously it is trivial to compute the actual, correct DPI for each screen.
Obviously in a multi-screen set-up using Xinerama this has the potential to be a Hard Problem if the monitors differ greatly in their DPI.
If the major resistance is over what to do with older hardware that doesn't have this data available, then yes, punt; use a hard-coded default. Likewise, if the two monitors really differ greatly, then punt.
Beyond that I'd say this violates POLA if the data is available and the xserver doesn't honor and correctly report it.
And it wouldn't be so hard to add something like -dpi:0, -dpi:1, -dpi:2 command line options to specify per-screen dpi. I kinda thought I did that a long, long time ago, but maybe I only thought about doing it and never actually got around to it.
My 2¢ worth.
--
Kaleb
On Tue, 2011-10-04 at 11:46 -0400, Kaleb S. KEITHLEY wrote:
Grovelling around in the F15 xorg-server sources and reviewing the Xorg log file on my F15 box, I see, with _modern hardware_ at least, that we do have the monitor geometry available from DDC or EDID, and obviously it is trivial to compute the actual, correct DPI for each screen.
I am clearly going to have to explain this one more time, forever. Let's see if I can't write it authoritatively once and simply answer with a URL from here out. (As always, use of the second person "you" herein is plural, not singular.)
EDID does not reliably give you the size of the display.
Base EDID has at least two different places where you can give a physical size (before considering extensions that aren't widely deployed so whatever). The first is a global property, measured in centimeters, of the physical size of the glass. The second is attached to your (zero or more) detailed timing specifications, and reflects the size of the mode, in millimeters.
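As a concrete illustration (a hypothetical sketch only, with no error handling, assuming a raw 128-byte base EDID block whose first descriptor is a detailed timing; offsets as I read the base EDID spec):

    # Minimal sketch: the two physical-size fields in a base EDID block.
    def edid_sizes(edid):
        # Bytes 21/22: maximum image size of the glass, in centimetres
        # (0 means "unknown", e.g. projectors; EDID 1.4 may encode an
        # aspect ratio here instead).
        glass_cm = (edid[21], edid[22])

        # First detailed timing descriptor starts at byte 54; bytes 12/13
        # carry the low 8 bits of the image size in millimetres, byte 14
        # the upper nibbles.
        d = edid[54:54 + 18]
        mode_mm = (((d[14] & 0xF0) << 4) | d[12],
                   ((d[14] & 0x0F) << 8) | d[13])
        return glass_cm, mode_mm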
So, how does this screw you?
a) Glass size is too coarse. On a large display that cm roundoff isn't a big deal, but on subnotebooks it's a different game. The 11" MBA is 25.68x14.44 cm, so that gives you a range of 52.54-54.64 dpcm horizontal and 51.20-54.86 dpcm vertical (133.4-138.8 dpi h and 130.0-139.3 dpi v). Which is optimistic, because that's doing the math forward from knowing the actual size, and you as the EDID parser can't know which way the manufacturer rounded.
b) Glass size need not be non-zero. This is in fact the usual case for projectors, which don't have a fixed display size since it's a function of how far away the wall is from the lens.
c) Glass size could be partially non-zero. Yes, really. EDID 1.4 defines a method of using these two bytes to encode aspect ratio, where if vertical size is 0 then the aspect ratio is computed as (horizontal value + 99) / 100 in portrait mode (and the obvious reverse thing if horizontal is zero). Admittedly, unlike every other item in this list, I've never seen this in the wild. But it's legal.
d) Glass size could be a direct encoding of the aspect ratio. Base EDID doesn't condone this behaviour, but the CEA spec (to which all HDMI monitors must conform) does allow-but-not-require it, which means your 1920x1080 TV could claim to be 16 "cm" by 9 "cm". So of course that's what TV manufacturers do because that way they don't have to modify the EDID info when physical construction changes, and that's cheaper.
e) You could use mode size to get size in millimeters, but you might not have any detailed timings.
f) You could use mode size, but mode size is explicitly _not_ glass size. It's the size that the display chooses to present that mode. Sometimes those are the same, and sometimes they're not. You could be scaled or {letter,pillar}boxed, and that's not necessarily something you can control from the host side.
g) You could use mode size, but it could be an encoded aspect ratio, as in case d above, because CEA says that's okay.
h) You could use mode size, but it could be the aspect ratio from case d multiplied by 10 in each direction (because, of course, you gave size in centimeters and so your authoring tool just multiplied it up).
i) Any or all of the above could be complete and utter garbage, because - and I really, really need you to understand this - there is no requirements program for any commercial OS or industry standard that requires honesty here, as far as I'm aware. There is every incentive for there to _never_ be one, because it would make the manufacturing process more expensive.
So from this point the suggestion is usually "well come up with some heuristic to make a good guess assuming there's some correlation between the various numbers you're given". I have in fact written heuristics for this, and they're in your kernel and your X server, and they still encounter a huge number of cases where we simply _cannot_ know from EDID anything like a physical size, because - to pick only one example - the consumer electronics industry are cheap bastards, because you the consumer demanded that they be cheap.
And then your only recourse is to an external database, and now you're up the creek again because the identifying information here is a vendor/model/serial tuple, and the vendor can and does change physical construction without changing model number. Now you get to play the guessing game of how big the serial number range is for each subvariant, assuming they bothered to encode a serial number - and they didn't. Or, if they bothered to encode week/year of manufacture correctly - and they didn't - which weeks meant which models. And then you still have to go out and buy one of every TV at Fry's, and that covers you for one market, for three months.
If someone wants to write something better, please, by all means. If it's kernel code, send it to dri-devel@lists.freedesktop.org and cc me and I will happily review it. Likewise xorg-devel@ for X server changes.
I gently suggest that doing so is a waste of time.
But if there's one thing free software has taught me, it's that you can not tell people something is a bad idea and have any expectation they will believe you.
Obviously in a multi-screen set-up using Xinerama this has the potential to be a Hard Problem if the monitors differ greatly in their DPI.
If the major resistance is over what to do with older hardware that doesn't have this data available, then yes, punt; use a hard-coded default. Likewise, if the two monitors really differ greatly, then punt.
I'm going to limit myself to observing that "greatly" is a matter of opinion, and that in order to be really useful you'd need some way of communicating "I punted" to the desktop.
Beyond that, sure, pick a heuristic, accept that it's going to be insufficient for someone, and then sit back and wait to get second-guessed on it over and over.
And it wouldn't be so hard to add something like -dpi:0, -dpi:1, -dpi:2 command line options to specify per-screen dpi. I kinda thought I did that a long, long time ago, but maybe I only thought about doing it and never actually got around to it.
The RANDR extension as of version 1.2 does allow you to override physical size on a per-output basis at runtime. We even try pretty hard to set them as honestly as we can up front. The 96dpi thing people complain about is from the per-screen info, which is simply a default because of all the tl;dr above; because you have N outputs per screen which means a single number is in general useless; and because there is no way to refresh the per-screen info at runtime, as it's only ever sent in the initial connection handshake.
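To illustrate what using the per-output numbers looks like in practice, here is a rough sketch that shells out to xrandr and derives a DPI per connected output (the parsing is deliberately simplistic and assumes the usual 'NAME connected WxH+X+Y ... WWWmm x HHHmm' line format; outputs reporting 0 mm are skipped):

    import re
    import subprocess

    # Per-output DPI from `xrandr` lines such as:
    #   LVDS1 connected 1200x900+0+0 (normal left inverted ...) 152mm x 114mm
    pattern = re.compile(
        r'^(\S+) connected.*?(\d+)x(\d+)\+\d+\+\d+.*? (\d+)mm x (\d+)mm')

    for line in subprocess.check_output(['xrandr'], text=True).splitlines():
        m = pattern.match(line)
        if not m:
            continue
        name = m.group(1)
        w, h, w_mm, h_mm = map(int, m.groups()[1:])
        if w_mm == 0 or h_mm == 0:
            continue  # physical size unknown (projector, bogus EDID, ...)
        print('%s: %.0f x %.0f dpi' % (name, w / (w_mm / 25.4), h / (h_mm / 25.4)))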
- ajax
Thanks for writing this up! It was good info. On Oct 4, 2011 7:55 PM, "Adam Jackson" ajax@redhat.com wrote:
[...]
On 10/04/2011 01:24 PM, Adam Jackson wrote:
I am clearly going to have to explain this one more time, forever. Let's see if I can't write it authoritatively once and simply answer with a URL from here out. (As always, use of the second person "you" herein is plural, not singular.)
....
Thanks for the explanation. This reminds me of when everyone was using CRT monitors. There was no way to know the monitor's supported refresh rates from the hardware, so, being careful, OSes defaulted to the lowest setting. Why isn't it possible to keep the hardcoded 96dpi value but provide a UI that shows the hardware-provided values (or the ones obtained via those heuristics that sometimes fail), with an action to try a setting and revert if you don't like the result or don't confirm within 10 seconds? At least trying a different DPI setting is less dangerous to the hardware than trying a higher refresh rate was on old CRT monitors.
Ordinary users don't care about DPI any more than they do about what number point or pixel size their favorite font size is.
Why can't something akin to http://people.gnome.org/~federico/news-2007-01.html be employed so that no one gets initialized or stuck with unsuitable sizes?
Snap the result to a multiple of 4, 6, 12 or 24 so that font steps between sizes coordinate well with common scalable font behavior, and for those who desire greater accuracy, offer an advanced opt-out to the snap.
Provide optional inputs for screen width, height and/or diagonal and the under-the-covers DPI might occasionally turn out to match the display density, an ideal result for probably most people.
Focus on getting it suitable for single display users before attacking multis.
Hi,
On Tue, Oct 4, 2011 at 6:54 PM, Adam Jackson ajax@redhat.com wrote:
I am clearly going to have to explain this one more time, forever. Let's see if I can't write it authoritatively once and simply answer with a URL from here out. (As always, use of the second person "you" herein is plural, not singular.)
Thanks for the explanation... There is an alternative to endless explanation - roll out your best effort at a heuristic and let the crowd contribute to an ever growing set of exceptions.
To play the devil's advocate, I'm asking why the monitor situation is different from any other bit of hardware. Drivers for mice, touchpads, wifi, NICs etc. all suffer from the same lack of rigorous published specs/documentation; they are supported in Linux, fallibly, by ever-growing tables of parameters and heuristics.
-Cam
On Tue, Oct 4, 2011 at 4:03 PM, Camilo Mesias camilo@mesias.co.uk wrote:
Thanks for the explanation... There is an alternative to endless explanation - roll out your best effort at a heuristic and let the crowd contribute to an ever growing set of exceptions.
Well, actually, people complain a lot more than they contribute code ;-)
To play the devil's advocate, I'm asking why the monitor situation is different from any other bit of hardware.
And he just explained -- fairly well I would say.
On my part, I say thanks Adam -- even being familiar with some of the vagaries of manufacturing data for general hardware, monitor's EDID sounds like an extra-deep nightmare.
For fedora users, as others have mentioned, perhaps a UI that lets users test a couple of possible dpi values might be useful (for those users so inclined). It does have to cross a good chunk of the stack to work well, and seems like a lot of work to get right; but the xrandr improvements are a start.
For distributors -- such as OLPC -- that know what HW they are shipping, it is important to be able to override the guesswork and state /this/ is my dpi. As far as I can see, Daniel has a way to do it -- in other cases (i.e. mozilla's xulrunner) we've had to patch some versions so that they'd accept a configured dpi.
cheers,
m
On Tue, Oct 04, 2011 at 04:17:08PM -0400, Martin Langhoff wrote:
For fedora users, as others have mentioned, perhaps a UI that lets users test a couple of possible dpi values might be useful (for those users so inclined). It does have to cross a good chunk of the stack to work well, and seems like a lot of work to get right; but the xrandr improvements are a start.
Windows used to have a gui that would show a ruler on your monitor and say "hold a real ruler up to this and slide the slider until it's the same size." Given what's been said about how Windows handles DPI I can only wonder what it did, but it might be a nice thing to have.
--CJD
On Tue, 2011-10-04 at 16:24 -0400, Casey Dahlin wrote:
On Tue, Oct 04, 2011 at 04:17:08PM -0400, Martin Langhoff wrote:
For fedora users, as others have mentioned, perhaps a UI that lets users test a couple of possible dpi values might be useful (for those users so inclined). It does have to cross a good chunk of the stack to work well, and seems like a lot of work to get right; but the xrandr improvements are a start.
Windows used to have a gui that would show a ruler on your monitor and say "hold a real ruler up to this and slide the slider until its the same size." Given what's been said about how windows handles DPI I can only wonder what it did, but it might be a nice thing to have.
I think it was more some specific app that did that, wasn't it? I'm almost sure it was either Paint Shop Pro or the GIMP, because obviously, actual physical accuracy is quite important there. Otherwise it was something like Office. It was definitely some specific app where WYSIWYG was important, not an OS.
On Tuesday, October 04, 2011 10:08:33 PM Adam Williamson wrote:
Windows used to have a gui that would show a ruler on your monitor and say "hold a real ruler up to this and slide the slider until its the same size." Given what's been said about how windows handles DPI I can only wonder what it did, but it might be a nice thing to have.
I think it was more some specific app that did that, wasn't it? I'm almost sure it was either Paint Shop Pro or the GIMP, because obviously, actual physical accuracy is quite important there. Otherwise it was something like Office. It was definitely some specific app where WYSIWYG was important, not an OS.
It makes sense to do that when configuring a desktop environment like Gnome or KDE.
Regards, J
Adam Williamson wrote on Tue, Oct 04, 2011 at 07:08:33PM -0700:
On Tue, 2011-10-04 at 16:24 -0400, Casey Dahlin wrote:
On Tue, Oct 04, 2011 at 04:17:08PM -0400, Martin Langhoff wrote:
For fedora users, as others have mentioned, perhaps a UI that lets users test a couple of possible dpi values might be useful (for those users so inclined). It does have to cross a good chunk of the stack to work well, and seems like a lot of work to get right; but the xrandr improvements are a start.
Windows used to have a gui that would show a ruler on your monitor and say "hold a real ruler up to this and slide the slider until its the same size." Given what's been said about how windows handles DPI I can only wonder what it did, but it might be a nice thing to have.
I think it was more some specific app that did that, wasn't it? I'm almost sure it was either Paint Shop Pro or the GIMP, because obviously, actual physical accuracy is quite important there. Otherwise it was something like Office. It was definitely some specific app where WYSIWYG was important, not an OS.
A specific app may have done it as well, but before Vista, the DPI settings dialog box in the Control Panel had that feature.
-mat
On Wed, 2011-10-05 at 10:49 -0500, Matyas Selmeci wrote:
Windows used to have a gui that would show a ruler on your monitor and say "hold a real ruler up to this and slide the slider until its the same size." Given what's been said about how windows handles DPI I can only wonder what it did, but it might be a nice thing to have.
I think it was more some specific app that did that, wasn't it? I'm almost sure it was either Paint Shop Pro or the GIMP, because obviously, actual physical accuracy is quite important there. Otherwise it was something like Office. It was definitely some specific app where WYSIWYG was important, not an OS.
A specific app may have done it as well, but before Vista, the DPI settings dialog box in the Control Panel had that feature.
ah, OK. thanks for the reminder.
On Tue, 2011-10-04 at 21:03 +0100, Camilo Mesias wrote:
Hi,
On Tue, Oct 4, 2011 at 6:54 PM, Adam Jackson ajax@redhat.com wrote:
I am clearly going to have to explain this one more time, forever. Let's see if I can't write it authoritatively once and simply answer with a URL from here out. (As always, use of the second person "you" herein is plural, not singular.)
Thanks for the explanation... There is an alternative to endless explanation - roll out your best effort at a heuristic and let the crowd contribute to an ever growing set of exceptions.
I think you missed the part where I said I already had done so, that you're already running them, and that I take patches.
I think building the giant database for DPI is a losing battle, and I don't intend to work on it myself. The bright line for the kernel's current quirks list has so far been that we take quirks for fixing mode setup, only. Ancillary data like physical size just isn't something the kernel needs to know.
But if people do insist on it, there's some points of implementation that really should be considered, and I'm happy to discuss them. Overriding EDID is a subtle problem once you get past wanting to make just one permanently-connected display work on one machine. If the future being designed looks like "play with this complicated expert tool until it works for you" then that's not really finished solving the problem. The next person who uses a sufficiently similar monitor with the same set of EDID problems should never have to touch that tool.
How people use that information is entirely not my concern. I have an opinion and it's probably wrong in some cases and I am neither advocating nor defending any such choices here. I'm just here to tell you that the hardware _is_ out to get you, and that the current behaviour is in fact a considered choice and not simply willful malice.
- ajax
On Tue, 2011-10-04 at 13:54 -0400, Adam Jackson wrote:
I'm going to limit myself to observing that "greatly" is a matter of opinion, and that in order to be really useful you'd need some way of communicating "I punted" to the desktop.
Beyond that, sure, pick a heuristic, accept that it's going to be insufficient for someone, and then sit back and wait to get second-guessed on it over and over.
All this is interesting, but it basically consists of a long list of reasons why the EDID info isn't always correct.
96dpi, however, is almost *never* correct, is it? So just taking a hardcoded number that Microsoft happened to pick a decade ago is hardly improving matters.
It still seems to me that taking the EDID number if it seems reasonably plausible and falling back to 96dpi otherwise is likely a better option.
Your examples lean a lot on TVs and projectors, but are those really the key use cases we have to consider? What about laptops and especially tablets, whose resolutions are gradually moving upwards (in the laptop case despite the underlying software problems, in the tablet case because the underlying software doesn't have such a problem)? Is it really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7" tablet and it comes up with zonking huge fonts all over the place?
I think it's worth considering that, even though Microsoft's crappiness with resolution independence has probably hindered the market artificially for a while, the 96dpi number which comes from the capabilities of CRT tubes circa 1995 bears increasingly little resemblance to the capabilities of modern displays, and assuming we can just keep hardcoding 96dpi and monitor technology will remain artificially retarded forever is likely not a great thing to do.
On Tue, 2011-10-04 at 19:05 -0700, Adam Williamson wrote:
Is it really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7" tablet and it comes up with zonking huge fonts all over the place?
Er - s/zonking huge/ridiculously tiny/, of course.
On Tue, 2011-10-04 at 19:05 -0700, Adam Williamson wrote:
96dpi, however, is almost *never* correct, is it? So just taking a hardcoded number that Microsoft happened to pick a decade ago is hardly improving matters.
The X default used to be 72dpi. Maybe it'll be something else in the future, and then I can get bitched at more for having changed it yet again by people still using a fundamentally unreliable API.
It still seems to me that taking the EDID number if it seems reasonably plausible and falling back to 96dpi otherwise is likely a better option.
I reiterate: X gives you the actual sizes (as best as we can guess) on the RANDR outputs. The global "size" that we default to 96dpi is broken to rely on in any event, because X simply has no mechanism for updating it besides reconnecting to the display.
We could add a request to re-fetch the connection handshake block, but if you're going to update all your apps to use that request, you might as well update all your apps to use the existing RANDR's geometry information instead.
If the UI wants to be sensitive to DPI, then do me the favor of using the DPI numbers that map 1:1 to actual monitors, instead of a single number that can never be an accurate reflection of reality.
Your examples lean a lot on TVs and projectors, but are those really the key use cases we have to consider? What about laptops and especially tablets, whose resolutions are gradually moving upwards (in the laptop case despite the underlying software problems, in the tablet case because the underlying software doesn't have such a problem)? Is it really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7" tablet and it comes up with zonking huge fonts all over the place?
I'm going to not mention the traditional monitors I've seen with bad EDID. I'm going to not mention the laptops I've seen that report 0x0 physical size, or something non-zero and fictitious. I'm going to not mention the laptops where you simply don't get EDID, you get some subset buried in the video ROM, and you get to hope that it might have physical size encoded in it. I'm going to not mention that DPI is only approximately what you want anyway, and that you actually need to know dots per unit arc, which is a function of both display size and view distance.
I'm going to simply quote myself from another message in this thread: How people use this information is entirely not my concern. My job is to get the pixels on the screen; it might be to try valiantly to tell you how big they are; it is not to decide if they're big enough.
I think it's worth considering that, even though Microsoft's crappiness with resolution independence has probably hindered the market artificially for a while, the 96dpi number which comes from the capabilities of CRT tubes circa 1995 bears increasingly little resemblance to the capabilities of modern displays, and assuming we can just keep hardcoding 96dpi and monitor technology will remain artificially retarded forever is likely not a great thing to do.
I don't believe that was a position I was defending.
I would caution you against thinking that there's some DPI revolution right around the corner. That's the same fallacy that rages against the TV industry for "stalling" at 1080p. Linear increases in DPI are quadratic increases in link bandwidth, and maxed-out single-link DVI (the source of the 1080p limit) is already a higher symbol rate than gigabit ethernet.
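Rough numbers behind that claim (a back-of-the-envelope sketch: uncompressed 8-bit RGB, ignoring blanking and TMDS encoding overhead):

    # Raw pixel data rate for 1080p60, before blanking/encoding overhead.
    w, h, hz, bits_per_pixel = 1920, 1080, 60, 24
    gbps = w * h * hz * bits_per_pixel / 1e9
    print(gbps)        # ~3 Gbit/s -- already past gigabit ethernet
    print(gbps * 4)    # doubling DPI at the same physical size quadruples it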
- ajax
On Wed, 2011-10-05 at 10:30 -0400, Adam Jackson wrote:
On Tue, 2011-10-04 at 19:05 -0700, Adam Williamson wrote:
96dpi, however, is almost *never* correct, is it? So just taking a hardcoded number that Microsoft happened to pick a decade ago is hardly improving matters.
The X default used to be 72dpi. Maybe it'll be something else in the future, and then I can get bitched at more for having changed it yet again by people still using a fundamentally unreliable API.
That does seem like the most likely fudge that'll happen, yes: we'll probably wind up with three 'standard' DPIs (say 96, 200, and 300), all hardware built to approximate one of these, and computers that only have to guess which one is right.
It still seems to me that taking the EDID number if it seems reasonably plausible and falling back to 96dpi otherwise is likely a better option.
I reiterate: X gives you the actual sizes (as best as we can guess) on the RANDR outputs. The global "size" that we default to 96dpi is broken to rely on in any event, because X simply has no mechanism for updating it besides reconnecting to the display.
We started this thread off talking about GNOME, not X. I'm still thinking about GNOME, not X, as the thing that effectively hardcodes 96dpi. It has the option to get the 'correct' (probably) DPI from X, but chooses not to.
Your examples lean a lot on TVs and projectors, but are those really the key use cases we have to consider? What about laptops and especially tablets, whose resolutions are gradually moving upwards (in the laptop case despite the underlying software problems, in the tablet case because the underlying software doesn't have such a problem)? Is it really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7" tablet and it comes up with zonking huge fonts all over the place?
I'm going to not mention the traditional monitors I've seen with bad EDID. I'm going to not mention the laptops I've seen that report 0x0 physical size, or something non-zero and fictitious. I'm going to not mention the laptops where you simply don't get EDID, you get some subset buried in the video ROM, and you get to hope that it might have physical size encoded in it.
You just did, sorry. ;) Hardware sucks. We know this. Fedora generally takes the position that it's correct to engineer things properly and regretfully explain that the hardware sucks when this causes problems, not engineer hacks and bodges to account for broken hardware.
I'm going to not mention that DPI is only approximately what you want anyway, and that you actually need to know dots per unit arc, which is a function of both display size and view distance.
Yeah, that's the fudgiest part, and why laptops can get away with going as high as 150dpi (though I *do* quite frequently see people with 1366x768 or even 1600x900 laptops using them at 1024x768...headdesk). I like to deal with that problem by not thinking about it too hard. ;) Ironically, though, it's a reason not to worry about the TV case too much, because TVs tend to have a sort of 'standard' dots per unit arc - if you know that what you're dealing with is a TV you can make some reasonably safe assumptions about how big you should paint stuff.
I'm going to simply quote myself from another message in this thread: How people use this information is entirely not my concern. My job is to get the pixels on the screen; it might be to try valiantly to tell you how big they are; it is not to decide if they're big enough.
Sure. I was not directing my message entirely at you personally, but at the question of whether it's a good idea for GNOME to simply say '96dpi is it'.
I would caution you against thinking that there's some DPI revolution right around the corner. That's the same fallacy that rages against the TV industry for "stalling" at 1080p. Linear increases in DPI are quadratic increases in link bandwidth, and maxed-out single-link DVI (the source of the 1080p limit) is already a higher symbol rate than gigabit ethernet.
I actually think there is; TV is not 'stalled' at 1080p, there's clear moves towards 4K (especially since, with 3D more or less tanking, it's the next Great White Hope of TV manufacturers to get people to replace their HDTVs). The 96dpi number has probably survived so long because it happens to approximate what you get when you display 'HD' resolutions on the size of monitor most people are happy to have on their desks - 720p at 19-20", 1080p at 22-24" or so. Once 4K gets some market traction, some marketing genius somewhere is going to realise there's money in them thar hills - the first mover to sell a 4K, 22" monitor is going to have a nice selling point that's easily understandable by consumers. That's almost exactly 200dpi - my second 'magic density'. No-one's going to get very far trying to sell people 45" desktop monitors - the kind of size you'd need to get back down to ~100dpi at 4K resolutions. I don't think the IC problem is going to hold anyone up for very long, there's a new HDMI revision every Wednesday, seems like...
On Wed, Oct 05, 2011 at 09:26:59AM -0700, Adam Williamson wrote:
You just did, sorry. ;) Hardware sucks. We know this. Fedora generally takes the position that it's correct to engineer things properly and regretfully explain that the hardware sucks when this causes problems, not engineer hacks and bodges to account for broken hardware.
Really? Because honestly if that's the position we generally hold then I'm just closing most of my bugs WONTFIX from here on out.
There are multiple issues here. The first is that X reports a 96dpi value because X can only report one value, so it might as well pick something that at least roughly matches user expectations. But like Adam says, randr gives you the per head measurements and you could work things out from there.
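As a sketch of "working things out from there", something like the following derives a per-output DPI from what RandR reports. It shells out to xrandr and assumes the usual "connected 1920x1080+0+0 ... 477mm x 268mm" line format, which is not guaranteed, and it will happily report nonsense whenever the hardware lies about its size:

  import re
  import subprocess

  def per_output_dpi():
      out = subprocess.check_output(["xrandr"], text=True)
      pattern = re.compile(
          r"^(\S+) connected.*?(\d+)x(\d+)\+\d+\+\d+.*?(\d+)mm x (\d+)mm",
          re.MULTILINE)
      results = {}
      for name, w_px, h_px, w_mm, h_mm in pattern.findall(out):
          w_px, h_px, w_mm, h_mm = map(int, (w_px, h_px, w_mm, h_mm))
          if w_mm == 0 or h_mm == 0:
              results[name] = None    # physical size unknown (or reported as 0x0)
          else:
              results[name] = (w_px / (w_mm / 25.4), h_px / (h_mm / 25.4))
      return results

  print(per_output_dpi())   # e.g. {'LVDS1': (200.5, 200.5), 'VGA1': None}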
But then things get awkward. That information is often lies. So we can add a heuristic that clamps the DPI to something in-between 72 and 250 and we probably won't exclude any real displays right now, but we do know that even by absolute standards some people are suddenly going to have tiny fonts and some people are suddenly going to have huge fonts because the hardware lies. But we'll write that off as broken hardware.
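A sketch of the clamping heuristic being described - note that, as clarified later in the thread, nothing like this is actually implemented; the 72-250 bounds and the 96 fallback are taken from the paragraph above:

  def effective_dpi(width_px, width_mm, low=72.0, high=250.0, fallback=96.0):
      # Trust the reported physical size only if the implied DPI looks plausible.
      if width_mm <= 0:
          return fallback    # 0x0 or missing size: the hardware told us nothing
      dpi = width_px / (width_mm / 25.4)
      if low <= dpi <= high:
          return dpi         # plausible: believe the reported size
      return fallback        # implausible: assume the EDID is lying

  print(effective_dpi(1200, 152))   # XO panel: ~200, within bounds, believed
  print(effective_dpi(1920, 160))   # bogus 16cm-wide panel claim: ~305, clamped to 96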
(You've also changed expected behaviour, because lots of people *want* small fonts on high-DPI screens, but again let's just chalk that up to incorrect expectations and make sure there's an easy UI that lets them change their global font size)
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
But what about the single monitor case? Let's go back to your Vaio. It's got a high DPI screen, so let's adjust to that. Now you're happy. Right up until you plug in an external monitor and now when you run any applications on the external display your fonts are twice the size they should be. WOOHOO GO TEAM of course that won't make us look like amateurs at all. So you need another heuristic to handle that, and of course "heuristic" is an ancient african word meaning "maybe bonghits will make this problem more tractable".
We have no technological solution for dealing with the fact that applications may move from one DPI to another at runtime, and may even be displaying on both displays at once. All of which doesn't matter, of course, because we don't even have a well-defined problem statement. What are we actually trying to solve here?
Honestly, it's valuable for applications to be able to identify the DPI of the screen they're running on. For certain design purposes it may well be helpful for an application to have "100%" map to "this is what a sheet of paper the same distance away would look like", and so the fact that this is available to applications is a good thing.
But is it valuable for "My fonts and icons look different on different displays"? Sure, if you only ever use a single display, which is no longer the ubiquitous situation that it used to be. Or, in other words, no. It's not valuable.
How about "My fonts are too small on my high-DPI laptop"? Well, yes, that's a problem. And we should ensure that there's a usable way for you to fix that. But really in that situation my first port of call would be to search the font settings for a button that says "Make my fonts bigger", not to look in display settings for something that lets me drag a bar across the screen to line up with a ruler.
In summary: Accurate DPI measurement is a means to an end, not an end in itself. Define the problem you're trying to solve, and then work out whether figuring out the real DPI would solve it. Unless your problem statement is unrealistically narrow, the answer is that it wouldn't.
On Wed, 2011-10-05 at 18:49 +0100, Matthew Garrett wrote:
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
There's no _magic_ way to fix anything, no. Things get fixed by code writers writing code. That would seem to be the obvious thing to do...
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
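A sketch of what such a "DPI category" bodge might look like - purely hypothetical; the 96 and 192 buckets are illustrative, not anything that exists:

  STANDARD_DPI_CATEGORIES = (96, 192)   # illustrative 'normal' and 'high' buckets

  def dpi_category(measured_dpi):
      # Snap a measured DPI to the nearest standard category.
      return min(STANDARD_DPI_CATEGORIES, key=lambda d: abs(d - measured_dpi))

  print(dpi_category(98))    # -> 96
  print(dpi_category(215))   # -> 192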
On Wed, 2011-10-05 at 12:31 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 18:49 +0100, Matthew Garrett wrote:
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
There's no _magic_ way to fix anything, no. Things get fixed by code writers writing code. That would seem to be the obvious thing to do...
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
Simo.
On Wed, 2011-10-05 at 15:44 -0400, Simo Sorce wrote:
On Wed, 2011-10-05 at 12:31 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 18:49 +0100, Matthew Garrett wrote:
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
There's no _magic_ way to fix anything, no. Things get fixed by code writers writing code. That would seem to be the obvious thing to do...
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
If they're sufficiently different in DPI, sure. Or would you really want everything to suddenly become twice as small if you were moving a window from a 100dpi monitor to a 200dpi one?
On Wed, 2011-10-05 at 12:49 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 15:44 -0400, Simo Sorce wrote:
On Wed, 2011-10-05 at 12:31 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 18:49 +0100, Matthew Garrett wrote:
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
There's no _magic_ way to fix anything, no. Things get fixed by code writers writing code. That would seem to be the obvious thing to do...
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
If they're sufficiently different in DPI, sure. Or would you really want everything to suddenly become twice as small if you were moving a window from a 100dpi monitor to a 200dpi one?
Are you also proposing to automatically resize all windows when you move them from one display to another ? There lies the road to disaster and pain imo.
At least until all rendering is done with something like svg and not with absolute pixel values this is just going to be a very bad experience.
Simo.
On Wed, 2011-10-05 at 15:56 -0400, Simo Sorce wrote:
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
If they're sufficiently different in DPI, sure. Or would you really want everything to suddenly become twice as small if you were moving a window from a 100dpi monitor to a 200dpi one?
Are you also proposing to automatically resize all windows when you move them from one display to another ? There lies the road to disaster and pain imo.
At least until all rendering is done with something like svg and not with absolute pixel values this is just going to be a very bad experience.
I'm more trying to think ahead about what's going to happen when our current convenient assumptions break down than making specific technical proposals. It just doesn't seem to me like a winning strategy to keep working on the basis that we can simply assume one notional DPI for all displays; sooner or later, given where display technology is going, this is likely to break down. (Unless we just go with what happened when we switched from 72dpi to 96dpi, I guess: wait until some arbitrary 'tipping point' in the adoption of hi-res displays and then say 'okay, new notional dpi is 200, get used to it'. But there may be too long of an overlap period for that to be practical.)
On 5.10.2011 21:56, Simo Sorce wrote:
On Wed, 2011-10-05 at 12:49 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 15:44 -0400, Simo Sorce wrote:
On Wed, 2011-10-05 at 12:31 -0700, Adam Williamson wrote:
On Wed, 2011-10-05 at 18:49 +0100, Matthew Garrett wrote:
So, ok, now you have some belief about the DPI. But which DPI? If you're dual head, you've got two. Unless they match you're screwed - there's no magic way to get applications to reflow text just because you've moved the window between screens, and what would you do with a window that's halfway between? You can argue that this is a corner case and obviously yes it's a corner case but if you can't even pretend to fix the corner case then your solution isn't a solution any more than 96dpi is.
There's no _magic_ way to fix anything, no. Things get fixed by code writers writing code. That would seem to be the obvious thing to do...
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
If they're sufficiently different in DPI, sure. Or would you really want everything to suddenly become twice as small if you were moving a window from a 100dpi monitor to a 200dpi one?
Are you also proposing to automatically resize all windows when you move them from one display to another ? There lies the road to disaster and pain imo.
At least until all rendering is done with something like svg and not with absolute pixel values this is just going to be a very bad experience.
Simo.
Actually you should think about it the opposite way. A window will always have the same size, e.g. 10x10 cm, and who cares how many pixels that is? Have you ever counted? Pixels are so ancient ...
Vit
On Wed, 5 October 2011 21:56, Simo Sorce wrote:
At least until all rendering is done with something like svg and not with absolute pixel values this is just going to be a very bad experience.
How is rendering ever going to be done with something like svg when no one bothers with the elements which have been available in vector formats since last millennium? You have to start somewhere.
On Wed, 5 October 2011 21:44, Simo Sorce wrote:
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
Unfortunately, when you get into situations with more than 150% difference in pixel densities between displays (as we've been creeping towards in the last decade) that's the only way to display text the user will be able to read.
You can check it now easily, just get a run-of-the-mill full-hd 15" laptop (not even a tiny netbook), a run-of-the-mill 22" or more screen (nothing especially uncommon either), create an extended desktop with both screens and try to set a satisfying font size. I defy you to find a setting that won't look way too small or way too big on one of the screens. And it won't matter if the user likes small or big fonts.
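The numbers bear this out; a rough calculation, with the panel sizes assumed rather than measured:

  import math

  diag_px = math.hypot(1920, 1080)              # both screens assumed 1920x1080
  print(diag_px / 15.6)                         # ~141 dpi on the 15.6" laptop panel
  print(diag_px / 22.0)                         # ~100 dpi on the 22" desktop monitor
  print((diag_px / 15.6) / (diag_px / 22.0))    # density ratio of roughly 1.4

So any single nominal font size comes out roughly 40% larger in physical terms on one of the two screens than on the other.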
On Thu, 2011-10-06 at 13:06 +0200, Nicolas Mailhot wrote:
Le Mer 5 octobre 2011 21:44, Simo Sorce a écrit :
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
Unfortunately, when you get into situations with more than 150% difference in pixel densities between displays (as we've been creeping towards in the last decade) that's the only way to display text the user will be able to read.
You can check it now easily, just get a run-of-the-mill full-hd 15" laptop (not even a tiny netbook), a run-of-the-mill 22" or more screen (nothing especially uncommon either), create an extended desktop with both screens and try to set a satisfying font size. I defy you to find a setting that won't look way too small or way too big on one of the screens. And it won't matter if the user likes small or big fonts.
Nicolas I am aware of the issue, but I am also aware of the technical difficulties in doing something like that. It's not possible today and I am not sure it will be in the near future.
So currently the only option is to tell the user that we do not support multiple displays where pixel density varies by more than 10% between them.
I would even go as far as saying that by default gnome should refuse to let you join together screens with such a large difference in density, except that apparently we cannot trust the HW info, so all we are left with is a bad user experience.
Simo.
On Thu, 2011-10-06 at 08:21 -0400, Simo Sorce wrote:
On Thu, 2011-10-06 at 13:06 +0200, Nicolas Mailhot wrote:
On Wed, 5 October 2011 21:44, Simo Sorce wrote:
Are you saying fonts should change on the fly when I move an app between 2 monitors that have different DPIs ?
Unfortunately, when you get into situations with more than 150% difference in pixel densities between displays (as we've been creeping towards in the last decade) that's the only way to display text the user will be able to read.
You can check it now easily, just get a run-of-the-mill full-hd 15" laptop (not even a tiny netbook), a run-of-the-mill 22" or more screen (nothing especially uncommon either), create an extended desktop with both screens and try to set a satisfying font size. I defy you to find a setting that won't look way too small or way too big on one of the screens. And it won't matter if the user likes small or big fonts.
Nicolas I am aware of the issue, but I am also aware of the technical difficulties in doing something like that. It's not possible today and I am not sure it will be in the near future.
So currently the only option is to tell the user that we do not support multiple displays where pixel density varies by more than 10% between them.
I would even go as far as saying that by default gnome should refuse to let you join together screens with such a large difference in density, except that apparently we cannot trust the HW info, so all we are left with is a bad user experience.
Please don't. I extend my 11" laptop onto a 32" TV, and despite the poor readability of regular fonts it still works just fine for what I need - movie playback, photo viewing, PDF presentations, etc.
-- Evandro
On 10/05/11 12:31, Adam Williamson wrote:
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Good point Adam. Even if the Xserver correctly intuits the resolution of each display, application behavior is going to be unacceptable. Consider dragging a window from a 200 dpi display to a 100 dpi display. Does the application detect this and correctly re-scale its window and interior widgets? If the Xserver re-scales the font, how does the app detect the change in bounding-box pixel geometry? How's the app supposed to behave if the window straddles monitors?
Unless all the graphic toolkits are significantly redesigned, there's no nice way to operate in Xinerama with mismatched monitors.
Adam Williamson awilliam@redhat.com writes:
[...] But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Perhaps the solution is to bring back Zaphod / classic-multi-head mode for dramatically different (e.g., DPI) outputs, so the issue of dragging windows between them does not arise.
- FChE
On Wed, Oct 05, 2011 at 12:31:50PM -0700, Adam Williamson wrote:
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Sure, in the future when we have font renderers that run in GPU shaders we can think about whether there's a plausible way to make applications work when they have to deal with multiple DPIs simultaneously. But we don't have any technology that can do any of that at the moment, and so the simple fact is that right now the decision to have gnome run at 96dpi regardless of the output is an entirely rational one and anyone who argues otherwise gets to explain how all the difficult bits would work. The end.
On Wed, 2011-10-05 at 21:31 +0100, Matthew Garrett wrote:
On Wed, Oct 05, 2011 at 12:31:50PM -0700, Adam Williamson wrote:
Like I replied to ajax, I suspect when the problem of assuming everything's 96dpi becomes simply too acute, instead of fixing everything really properly so that all displays correctly report their size and all desktops actually do resolution independence perfectly so it doesn't _matter_ if one of your displays is 98dpi and the other is 215dpi, everything still looks perfect, the industry will just wind up with a slightly more sophisticated bodge where we have a few 'standard' resolutions and just figure out which one your displays are closest to. But that's still going to require some kind of sensible handling of the case where one monitor is roughly 100dpi and the other is roughly 200dpi, unless we simply say 'you can't do that, all your displays have to be in the same DPI Category'.
Sure, in the future when we have font renderers that run in GPU shaders we can think about whether there's a plausible way to make applications work when they have to deal with multiple DPIs simultaneously. But we don't have any technology that can do any of that at the moment, and so the simple fact is that right now the decision to have gnome run at 96dpi regardless of the output is an entirely rational one and anyone who argues otherwise gets to explain how all the difficult bits would work. The end.
I'm just saying it would probably pay off to put some thought *now* into how to manage things when higher resolution displays become so prevalent that they can't be ignored, rather than desperately scrambling to catch up when you eventually realize it's happened.
On Wed, Oct 05, 2011 at 01:34:43PM -0700, Adam Williamson wrote:
I'm just saying it would probably pay off to put some thought *now* into how to manage things when higher resolution displays become so prevalent that they can't be ignored, rather than desperately scrambling to catch up when you eventually realize it's happened.
The likely outcome of higher density displays is that default font sizes will get larger. It's a problem if and only if it's common to use multiple displays of grossly different density, and fixing that problem is impossible unless we have a huge number of technical advances that nobody's even working on right now. It's worth thinking about. It's just not something that we're anywhere near being able to implement, and as such it's pretty unrelated to the original observation which is that trusting EDID right now will just get you burned.
Matthew Garrett mjg59@srcf.ucam.org writes:
We have no technological solution for dealing with the fact that applications may move from one DPI to another at runtime, and may even be displaying on both displays at once.
From a technology viewpoint, that is actually theoretically easy to handle on modern hardware: render everything as 3D objects and let the graphics hardware scale as appropriate.
To get it to look pretty you would need fairly high DPI monitors or fancy scaling algorithms though. I can imagine that sub-pixel rendering would be quite tricky to get right when DPI changes halfway through a character.
/Benny
On Wed, Oct 05, 2011 at 11:11:38PM +0200, Benny Amorsen wrote:
Matthew Garrett mjg59@srcf.ucam.org writes:
We have no technological solution for dealing with the fact that applications may move from one DPI to another at runtime, and may even be displaying on both displays at once.
From a technology viewpoint, that is actually theoretically easy to handle on modern hardware: render everything as 3D objects and let the graphics hardware scale as appropriate.
This... works badly. Really. Open gimp and add some text. Now double the size of the font. Save the image and open it in image viewer, and zoom out so the text is half the size. It doesn't look the same as your original text.
Rendering fonts (and even SVGs) well requires you to know the scale that you're rendering to. More pixels mean you can add more detail. If you shrink that then the additional detail is still there, getting in the way of the actually important information. Doing this properly requires that the original object renderer be part of the scaling process, and doing that on the fly with reasonable performance just isn't part of our rendering stack at the moment.
On Wed, 5 October 2011 23:35, Matthew Garrett wrote:
This... works badly. Really. Open gimp and add some text. Now double the size of the font. Save the image and open it in image viewer, and zoom out so the text is half the size. It doesn't look the same as your original text.
Rendering fonts (and even SVGs) well requires you to know the scale that you're rendering to. More pixels mean you can add more detail. If you shrink that then the additional detail is still there, getting in the way of the actually important information. Doing this properly requires that the original object renderer be part of the scaling process, and doing that on the fly with reasonable performance just isn't part of our rendering stack at the moment.
Which is exactly why forcing 96dpi on displays which have very different pixel densities *today* is not a good idea at all.
On Thu, 2011-10-06 at 13:13 +0200, Nicolas Mailhot wrote:
On Wed, 5 October 2011 23:35, Matthew Garrett wrote:
This... works badly. Really. Open gimp and add some text. Now double the size of the font. Save the image and open it in image viewer, and zoom out so the text is half the size. It doesn't look the same as your original text.
Rendering fonts (and even SVGs) well requires you to know the scale that you're rendering to. More pixels mean you can add more detail. If you shrink that then the additional detail is still there, getting in the way of the actually important information. Doing this properly requires that the original object renderer be part of the scaling process, and doing that on the fly with reasonable performance just isn't part of our rendering stack at the moment.
Which is exactly why forcing 96dpi on displays which have very different pixel densities *today* is not a good idea at all.
Non sequitur.
Simo.
On Thu, Oct 06, 2011 at 01:13:21PM +0200, Nicolas Mailhot wrote:
On Wed, 5 October 2011 23:35, Matthew Garrett wrote:
This... works badly. Really. Open gimp and add some text. Now double the size of the font. Save the image and open it in image viewer, and zoom out so the text is half the size. It doesn't look the same as your original text.
Rendering fonts (and even SVGs) well requires you to know the scale that you're rendering to. More pixels mean you can add more detail. If you shrink that then the additional detail is still there, getting in the way of the actually important information. Doing this properly requires that the original object renderer be part of the scaling process, and doing that on the fly with reasonable performance just isn't part of our rendering stack at the moment.
Which is exactly why forcing 96dpi on displays which have very different pixel densities *today* is not a good idea at all.
Knowing the number of pixels available means that the output will be legible, even if you'd prefer it to be a different size. Rescaling after rendering means that the output will be illegible, even if it's the correct size. Given that we don't have the ability to dynamically re-render everything the moment an application is moved between screens, what's your proposed solution?
On Thu, 6 October 2011 15:37, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 01:13:21PM +0200, Nicolas Mailhot wrote:
On Wed, 5 October 2011 23:35, Matthew Garrett wrote:
This... works badly. Really. Open gimp and add some text. Now double the size of the font. Save the image and open it in image viewer, and zoom out so the text is half the size. It doesn't look the same as your original text.
Rendering fonts (and even SVGs) well requires you to know the scale that you're rendering to. More pixels mean you can add more detail. If you shrink that then the additional detail is still there, getting in the way of the actually important information. Doing this properly requires that the original object renderer be part of the scaling process, and doing that on the fly with reasonable performance just isn't part of our rendering stack at the moment.
Which is exactly why forcing 96dpi on displays which have very different pixel densities *today* is not a good idea at all.
Knowing the number of pixels available means that the output will be legible, even if you'd prefer it to be a different size.
I don't call text which is significantly too big or too small legible.
When apps that use different toolkits perform different font size adjustments, the resulting UIs are inconsistent and generally tiring (uniformity is a huge factor in text readability).
When basic font sizes are out of whack because the desktop pretends the pixel density is way different from what it really is, users try to compensate using all the available size settings in their apps. The result is a huge heterogeneous mess. Settings and documents cannot be moved from one computer to another without strange font size changes. Different fonts are not adjusted the same way by users, because of size rounding in UIs, breaking font-set harmony. Sometimes you get apps that adjust via zooming without re-rendering.
On Wed, 2011-10-05 at 23:11 +0200, Benny Amorsen wrote:
Matthew Garrett mjg59@srcf.ucam.org writes:
We have no technological solution for dealing with the fact that applications may move from one DPI to another at runtime, and may even be displaying on both displays at once.
From a technology viewpoint, that is actually theoretically easy to handle on modern hardware: render everything as 3D objects and let the graphics hardware scale as appropriate.
Your use of the word "theoretically" reveals much. You would almost certainly be appalled by just how much geometry information is necessary to render a single glyph. Which is why we - and Windows, and OSX - don't do that. When you ask for a glyph at a given transformation matrix, it gets rasterized down to an A8 mask, and we reuse those from then on. (Okay, it's A8R8G8B8 if you're doing subpixel antialiasing). That's the only way you get anything like acceptable performance.
If it were easy, we'd already be doing it.
- ajax
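A toy illustration of the caching scheme described above - rasterize once per (glyph, transform) pair and reuse the mask - with the rasterizer stubbed out, since the real work lives in the font stack; all names here are hypothetical:

  _mask_cache = {}

  def rasterize(glyph_id, transform):
      # Stand-in for the expensive outline -> A8 coverage mask rasterization.
      return bytes(64)    # pretend this is an 8x8 alpha mask

  def glyph_mask(glyph_id, transform):
      key = (glyph_id, transform)       # transform must be hashable, e.g. a tuple
      mask = _mask_cache.get(key)
      if mask is None:
          mask = rasterize(glyph_id, transform)
          _mask_cache[key] = mask       # pay the rasterization cost only once
      return mask

  glyph_mask(ord("I"), (1.0, 0.0, 0.0, 1.0))   # first call rasterizes
  glyph_mask(ord("I"), (1.0, 0.0, 0.0, 1.0))   # second call hits the cache

The point being that every distinct transformation matrix is a new cache entry, which is exactly why "just let the GPU scale it" is not free.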
On 10/05/2011 07:49 PM, Matthew Garrett wrote:
But what about the single monitor case? Let's go back to your Vaio. It's got a high DPI screen, so let's adjust to that. Now you're happy. Right up until you plug in an external monitor and now when you run any applications on the external display your fonts are twice the size they should be. WOOHOO GO TEAM of course that won't make us look like amateurs at all. So you need another heuristic to handle that, and of course "heuristic" is an ancient african word meaning "maybe bonghits will make this problem more tractable".
Heh, I don't know about Adam's Vaio, but mine (the now-discontinued 13" Y, a.k.a. Vaio-S-with-ULV-without-optical-drive) has all sorts of strange quirks (e.g. totally broken ACPI backlight interface; Matthew would remember this) -- and it turns out that it did what ajax noted earlier too: xrandr reports a 0x0 physical screen size. *sigh*. So much for quality products.
But maybe a quick 'I know I have a 13.3" widescreen laptop, you know the resolution, just make things work' should work for the single-screen case (esp if we stick to certain target DPIs as Adam suggested). One shouldn't ask the typical user for information that's too cumbersome to use, oui? Like asking them to use a physical ruler to match up against.
On Thu, Oct 06, 2011 at 03:57:38PM +0200, Michel Alexandre Salim wrote:
But maybe a quick 'I know I have a 13.3" widescreen laptop, you know the resolution, just make things work' should work for the single-screen case (esp if we stick to certain target DPIs as Adam suggested). One shouldn't ask the typical user for information that's too cumbersome to use, oui? Like asking them to use a physical ruler to match up against.
Like I said, that works fine right up until the point where you plug in a monitor with a different DPI. What do we do then?
On 10/06/2011 09:45 AM, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 03:57:38PM +0200, Michel Alexandre Salim wrote:
But maybe a quick 'I know I have a 13.3" widescreen laptop, you know the resolution, just make things work' should work for the single-screen case (esp if we stick to certain target DPIs as Adam suggested). One shouldn't ask the typical user for information that's too cumbersome to use, oui? Like asking them to use a physical ruler to match up against.
Like I said, that works fine right up until the point where you plug in a monitor with a different DPI. What do we do then?
Use the same DPI as the main monitor? What is the difference between using the wrong DPI obtained from the main monitor on the external one and using the hardcoded 96 value when the external monitor is not 96dpi? Both are wrong, with no solution, so why not at least give the user the option to set the correct DPI for the internal one, using 96 as the default?
Once upon a time, Matthew Garrett mjg59@srcf.ucam.org said:
Like I said, that works fine right up until the point where you plug in a monitor with a different DPI. What do we do then?
I would wager that the majority of Fedora systems are single monitor (or, in the case of notebooks, single monitor much of the time); can't we at least try to correct for that case first, and _then_ try to deal with multi-monitor setups?
On Thu, Oct 06, 2011 at 09:30:50AM -0500, Chris Adams wrote:
Once upon a time, Matthew Garrett mjg59@srcf.ucam.org said:
Like I said, that works fine right up until the point where you plug in a monitor with a different DPI. What do we do then?
I would wager that the majority of Fedora systems are single monitor (or, in the case of notebooks, single monitor much of the time); can't we at least try to correct for that case first, and _then_ try to deal with multi-monitor setups?
Changing the current behaviour doesn't make the most common case significantly better, but potentially makes a less common (but still common) case significantly worse. What's the benefit?
On Thu, 6 October 2011 16:33, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 09:30:50AM -0500, Chris Adams wrote:
Once upon a time, Matthew Garrett mjg59@srcf.ucam.org said:
Like I said, that works fine right up until the point where you plug in a monitor with a different DPI. What do we do then?
I would wager that the majority of Fedora systems are single monitor (or, in the case of notebooks, single monitor much of the time); can't we at least try to correct for that case first, and _then_ try to deal with multi-monitor setups?
Changing the current behaviour doesn't make the most common case significantly better, but potentially makes a less common (but still common) case significantly worse. What's the benefit?
Have the same font size value mean the same thing in gnome and not-gnome apps?
Help people who use several computers calibrate them so they get the same font sizes on all of them, without having to remember that size 14 means one thing on one computer and another on others?
Support single-screen high-density screen setups by default?
Make network homes work as long as every client uses a display with working EDID?
Really, this focus on not letting people provide simple display sizes when the EDID is broken¹ is ridiculous, especially when *at the same time* GNOME 3.2 advertises support for colour correcting displays using ridiculously complex procedures²
As you've already pointed out, the new heuristic does not solve the complex cases. So it's no use enumerating them to justify it. It won't make them go away and they will need handling sooner or later with the heuristic or without it. In the meanwhile all the heuristic does is introduce inconsistencies in font handling between GNOME and everyone else.
¹ Which BTW is not the general case, most EDIDs are fine and have been for years
² Which I support (and indeed have packaged argyll for Fedora in the past) but it is hardly simple
The heuristic isn't the problem. The problem is that we have no technology that allows us to handle the complicated case of multiple displays, and solving it purely for the simple case makes the complicated case *worse*. Adding additional complexity for what would be, at best, a different set of problems doesn't seem like a worthwhile way to spend time. The only people who are actively upset by the status quo are the ones who have the ability to fix it for their case, anyway.
On Thu, 6 October 2011 17:18, Matthew Garrett wrote:
The heuristic isn't the problem. The problem is that we have no technology that allows us to handle the complicated case of multiple displays, and solving it purely for the simple case makes the complicated case *worse*.
How does it make it worse? The heuristic does not solve the complicated case *at all*. How could removing it possibly make it worse?
On Thu, Oct 06, 2011 at 05:33:48PM +0200, Nicolas Mailhot wrote:
On Thu, 6 October 2011 17:18, Matthew Garrett wrote:
The heuristic isn't the problem. The problem is that we have no technology that allows us to handle the complicated case of multiple displays, and solving it purely for the simple case makes the complicated case *worse*.
How does it make it worse? The heuristic does not solve the complicated case *at all*. How could removing it possibly make it worse?
What heuristic?
On Thursday, 06 October 2011 at 16:41 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 05:33:48PM +0200, Nicolas Mailhot wrote:
On Thu, 6 October 2011 17:18, Matthew Garrett wrote:
The heuristic isn't the problem. The problem is that we have no technology that allows us to handle the complicated case of multiple displays, and solving it purely for the simple case makes the complicated case *worse*.
How does it make it worse? The heuristic does not solve the complicated case *at all*. How could removing it possibly make it worse?
What heuristic?
The one you were writing about
Me, I don't care about which heuristic you were thinking of. Any heuristic will annoy large numbers of users when you're dealing with text. The rules should be simple and stupid:
A. on a single-screen system
1. use the xorg-detected screen size to compute the actual DPI, and base font sizes on it
2. if autodetection didn't work or the result looks wrong (because the hardware is broken, or it's not designed to be used on a desk but is a TV/projector), ask the user to provide the screen size (displaying a slider + a ruler in the locale's length unit, with the length unit displayed on screen too; users are smart enough to fake lengths if they want to). If you want market forces to work, crowdsource the complaining and tell the user his hardware is broken and he should take it up with the manufacturer.
3. save the results and reuse them each time the same screen is used
4. propagate the resulting DPI so every toolkit can use it (ideally propagate it down to the xorg level so every toolkit that uses the xorg DPI will just work)

B. when a second screen is detected
1. use the same rules to get its size
2. if the computed DPI for the second screen is too different from the first one, ask the user what to do (optimize for screen 1, for screen 2, or de-optimize both with a middle setting)
3. save the results to apply them automatically the next time this screen combination is encountered
4. longer term, start thinking about how to apply different DPIs to different outputs, as screen densities have been clearly diverging for some years and the combination of fixed-size laptops and increasing resolutions can only mean more divergence in the future

C. for font sizes
1. display them in points (pt) or pixels (px)
2. display the unit you're using; don't make the user guess what the perverted font dialog author had in mind
3. let the user specify them in points or pixels as he prefers, regardless of the default display unit
4. accept decimals; do not try to round sizes to integers
5. do not try to invent a new unit. Yes, historical units are stupid and badly thought out, but so are imperial units and the letter format, and that's what some countries still use. Units are not there to be smart; units are there so different people can agree on measurements. Points are what users will see in every electronic document they fill in; no new unit will be enough better to outweigh the hassle of the user having to deal with a new unit. If you have time to waste, go rewrite every electronic document app and format on the market to use your unit, and only afterwards inflict it on desktop users. And if you still think being pedantic about font size units is a good idea, try using only kelvins when talking to others about temperatures and see how they react.
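A minimal sketch of the decision flow in rule set A above - the helper names, the plausibility bounds and the stubbed "ask the user" step are all hypothetical:

  def dpi_for_screen(screen_id, detected_dpi, saved_dpis, ask_user_for_dpi):
      # Rule set A: trust detection, fall back to asking, remember the answer.
      if screen_id in saved_dpis:                   # A.3: reuse an earlier answer
          return saved_dpis[screen_id]
      if detected_dpi is not None and 50 <= detected_dpi <= 300:   # A.1, assumed sanity bounds
          dpi = detected_dpi
      else:                                         # A.2: detection failed or looks wrong
          dpi = ask_user_for_dpi(screen_id)
      saved_dpis[screen_id] = dpi                   # A.3: save for next time
      return dpi                                    # A.4: propagate to toolkits / xorg

  saved = {}
  print(dpi_for_screen("LVDS1", 201.0, saved, lambda s: 96.0))   # detection trusted
  print(dpi_for_screen("VGA1", None, saved, lambda s: 96.0))     # user asked (stubbed)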
On Thu, Oct 06, 2011 at 09:22:22PM +0200, Nicolas Mailhot wrote:
On Thursday, 06 October 2011 at 16:41 +0100, Matthew Garrett wrote:
What heuristic?
The one you were writing about
The heuristic I was writing about is the "Trust the DPI we get from EDID if it's within some size range". We don't implement that. We have no heuristic approach at all.
On 2011/10/06 21:22 (GMT+0200) Nicolas Mailhot composed:
C. for font sizes
- display them in points (pt) or pixels (px),
no
- display the unit you're using. Don't make the user guess what the perverted font dialog author had in mind
no
- let the user specify them in points or pixels as he prefers, regardless of the default display unit
no
Instead, display fonts of different sizes and have the user pick one. No need for the user to know or care about units or numbers or DPI - the same as web authors needlessly fixated on pixels instead of the fact that personal computers are expected to be personalized, which means desktop objects larger or smaller than as shipped from the vendor.
DPI is nothing but a way to describe pixel density, and users shouldn't care about numbers, only higher or lower, good or not good, better or not better. Higher DPI _is_ better, because higher means more resolution, more accuracy, and more pleasing viewing, but only as long as naive application and web site stylists don't force sizes that work only on the lower-density screens they're viewing during construction.
http://people.gnome.org/~federico/news-2007-01.html
On Thursday, 06 October 2011 at 15:39 -0400, Felix Miata wrote:
Instead, display fonts of different sizes and have the user pick one. No need for the user to know or care about units or numbers or DPI,
That does not work because those units are used in different electronic formats, for example office suites. You can invent all the new font measurements you want; at the end of the day a lot of users will open libreoffice or another app that requires them to know the size of a 12pt font. All you get by using another unit elsewhere is a new translation table and lots of confused users. (You can replace office suite with blog + custom css that works well on your computer but is hideously small/large when you check your site from your ma's computer.)
After the 'need to hide dpi and units' madness started, many users started to use custom font scales on their computer, and got really annoyed when they realised later that:
1. changing their hardware changed the meaning of all the sizes they were used to
2. they could not agree with their friends on what sizes meant, and the office documents they had to produce were mis-sized
That's the same reason you need to colour correct your screen if you're serious about photography. In theory your screen's colour drift does not matter because you can compensate for it camera-side. In practice the compensation ends up as huge colour mistakes as soon as you try to use the result on something else (another screen, printing the result - it does not matter; if you don't stick to a common, non-local baseline, things will drift).
On Thu, 2011-10-06 at 16:18 +0100, Matthew Garrett wrote:
The heuristic isn't the problem. The problem is that we have no technology that allows us to handle the complicated case of multiple displays, and solving it purely for the simple case makes the complicated case *worse*. Adding additional complexity for what would be, at best, a different set of problems doesn't seem like a worthwhile way to spend time. The only people who are actively upset by the status quo are the ones who have the ability to fix it for their case, anyway.
I am sure the display manager can easily grow a button to say something along the lines of: 'change font resolution to better fit multiple monitors', so that when someone who has widely varying DPIs between monitors plugs a second monitor in they can press that button and get whatever default you like best for that use case.
Simo.
<flamebait>I am not asking for a slider because I guess the options police would shoot it down :-P </flamebait>
On Thu, Oct 06, 2011 at 11:35:08AM -0400, Simo Sorce wrote:
I am sure the display manager can easily grow a button to say something along the lines of: 'change font resolution to better fit multiple monitors', so that when someone who has widely varying DPIs between monitors plugs a second monitor in they can press that button and get whatever default you like best for that use case.
We could do that, but you'd still need toolkit support for triggering a re-render of everything. And it'd be pretty dreadful UI ("why doesn't this just happen automatically?"). And suddenly everything on your internal display would be a different size and possibly even in a different place.
On Thu, 2011-10-06 at 16:44 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:35:08AM -0400, Simo Sorce wrote:
I am sure the display manager can easily grow a button to say something along the lines of: 'change font resolution to better fit multiple monitors', so that when someone who has widely varying DPIs between monitors plugs a second monitor in they can press that button and get whatever default you like best for that use case.
We could do that, but you'd still need toolkit support for triggering a re-render of everything. And it'd be pretty dreadful UI ("why doesn't this just happen automatically?"). And suddenly everything on your internal display would be a different size and possibly even in a different place.
The consequences are exactly the reason why I think it should not happen automatically and a button would be the right compromise.
Avoids WTF surprises if you just plug a monitor in and suddenly all your stuff changes, and still allows you to fix the size if the other monitor is too screwed up with the settings you have due to your main monitor.
In all cases where you have widely different DPIs I am sure you will find 50% of the people wanting the exact opposite of the other 50%, so not doing anything and letting the user "adjust" the situation only if he wants to seems the better way.
Plus IIRC the display manager tries to remember settings, so this is something that could be remembered as well, so users do not get annoyed with "but I already told it to do that yesterday when I plugged in the video projector the first time".
That said, I am not responsible for any of these changes, so I will leave it in the hands of the maintainers, hoping this discussion has improved everyone's understanding of the issues involved.
Simo.
A bit late, but still this thread has been slightly getting on my nerves...
On Thu, 6 Oct 2011 16:44:40 +0100 Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:35:08AM -0400, Simo Sorce wrote:
I am sure the display manager can easily grow a button to say something along the lines of: 'change font resolution to better fit multiple monitors', so that when someone who has widely varying DPIs between monitors plugs a second monitor in they can press that button and get whatever default you like best for that use case.
We could do that, but you'd still need toolkit support for triggering a re-render of everything.
Try changing font size in xfce (or gnome 2, not sure about gnome 3 or kde). Every GTK app re-renders. The same happens when I manually change DPI. So where's the missing support?
Now to the actual problem: a) Xorg+randr does its best at reporting the correct DPI per display, and probably should (or does) fall back to some default value when the screen dimensions are obviously incorrect. You could also add a white-list/black-list for devices whose real dimensions are known despite them reporting incorrect data. I have no reason not to believe ajax that X is already doing its best in this area. It apparently also allows for overriding the data in case the user knows the reported data are wrong. BTW, for my display it's reported correctly ;-)
b) DPI is the physical resolution of the pixels. In theory we always either know it or the user can measure it (you can measure the size of your monitor or TV, you can measure the size of the area a data-projector shines on, ...). However it's pointless to make use of it on TVs and data-projectors -- you watch them from a distance => you don't expect to read a 12pt font from several metres away.
c) DEs, toolkits and apps. They should IMHO do their best to make use of the data X provides, not try to override it. Allow the user to enter the correct DPI (you can let him check the physical resolution of the device, or use a physical ruler to match against a virtual one, or something like that). When adding another screen, the DPI of the first one should be used, as current toolkits don't support rendering, say, half of a window with one DPI and the other half with a different one (it looks incredibly hard to make that work, actually). At the same time the user should be asked whether he wants to use some kind of "magnifier" on the new screen. This setting should be remembered, so the next time the user plugs in the same device the same settings are applied.
d) Points are a real unit of measure -- 1/72 inch. So it does not make sense for them to be differently sized on different screens unless explicitly zoomed in/out. If you want to set a wrong DPI, don't use points for UI font sizes; it simply does not make sense. Unfortunately small, big, huge is also pointless - with high DPI even huge can be small, and vice versa (or when watching from a short or great distance).
Martin
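Point d) is just the definitional conversion between points and pixels; for example:

  def pt_to_px(points, dpi):
      # 1pt = 1/72 inch, so the pixel count scales directly with the DPI.
      return points * dpi / 72.0

  print(pt_to_px(12, 96))    # 16 px on a 96dpi screen
  print(pt_to_px(12, 201))   # ~33.5 px on a 201dpi screen - same physical size

Which is also why a point-sized font rendered at a forced 96dpi on a 201dpi panel comes out at less than half its nominal physical size.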
On 2011/10/06 15:33 (GMT+0100) Matthew Garrett composed:
On Thu, Oct 06, 2011 at 09:30:50AM -0500, Chris Adams wrote:
I would wager that the majority of Fedora systems are single monitor (or, in the case of notebooks, single monitor much of the time); can't we at least try to correct for that case first, and _then_ try to deal with multi-monitor setups?
Changing the current behaviour doesn't make the most common case significantly better, but potentially makes a less common (but still common) case significantly worse. What's the benefit?
Nearly always the permanent display will have the higher actual DPI. This means:
1 - 96 makes everything undersize too often on internal displays
2 - the external display has bigger text than the internal
3 - accurate on the internal rarely means no text anywhere is illegible
4 - it's invariably easier to make too big Xorg text smaller than the converse
5 - what Nicolas Mailhot wrote
What happens to/for multiple display users should only be fussed over after an appropriate strategy is developed for the vast majority that use a single display at a time.
On 10/05/2011 12:26 PM, Adam Williamson wrote:
You just did, sorry. ;) Hardware sucks. We know this. Fedora generally takes the position that it's correct to engineer things properly and regretfully explain that the hardware sucks when this causes problems, not engineer hacks and bodges to account for broken hardware.
That's a joke, right? You can't seriously believe this.
On Wed, 2011-10-05 at 13:50 -0400, Peter Jones wrote:
On 10/05/2011 12:26 PM, Adam Williamson wrote:
You just did, sorry. ;) Hardware sucks. We know this. Fedora generally takes the position that it's correct to engineer things properly and regretfully explain that the hardware sucks when this causes problems, not engineer hacks and bodges to account for broken hardware.
That's a joke, right? You can't seriously believe this.
It's at least a *standard excuse* I've seen wheeled out when it happens to be convenient.
Let's say there's something of a cognitive dissonance effect between the cases where we say 'well, we engineered it right and your hardware sucks, sorry' and the cases where we say 'well, too much hardware sucks so we can't engineer it right for your hardware that doesn't, sorry'...
On Tue, 2011-10-04 at 13:54 -0400, Adam Jackson wrote:
On Tue, 2011-10-04 at 11:46 -0400, Kaleb S. KEITHLEY wrote:
Grovelling around in the F15 xorg-server sources and reviewing the Xorg log file on my F15 box, I see, with _modern hardware_ at least, that we do have the monitor geometry available from DDC or EDID, and obviously it is trivial to compute the actual, correct DPI for each screen.
I am clearly going to have to explain this one more time, forever. Let's see if I can't write it authoritatively once and simply answer with a URL from here out. (As always, use of the second person "you" herein is plural, not singular.)
EDID does not reliably give you the size of the display.
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
Jon.
On Thu, Oct 06, 2011 at 11:14:56AM -0400, Jon Masters wrote:
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
The specification provides everything needed to express this data accurately, and proves the worth of specifications by virtue of approximately nobody actually implementing it correctly.
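For reference, the base EDID block does have fields for the physical size: bytes 21 and 22 carry the maximum image size in centimetres (0 means undefined, as for projectors), and the detailed timing descriptors carry a millimetre-precision size. Those are exactly the fields that so often come back zeroed or fictitious. A sketch that reads the centimetre fields from sysfs, assuming a KMS driver that exposes the raw EDID there:

  import glob

  def edid_reported_sizes():
      sizes = {}
      for path in glob.glob("/sys/class/drm/card*-*/edid"):
          with open(path, "rb") as f:
              edid = f.read()
          if len(edid) < 23:
              sizes[path] = None    # connector present but no EDID at all
          else:
              # Bytes 21/22 of the base block: max image size in cm (0 = undefined).
              sizes[path] = (edid[21], edid[22])
      return sizes

  print(edid_reported_sizes())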
On Thu, 2011-10-06 at 16:20 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:14:56AM -0400, Jon Masters wrote:
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
The specification provides everything needed to express this data accurately, and proves the worth of specifications by virtue of approximately nobody actually implementing it correctly.
How about an actual DPI value?
Jon.
On Thu, Oct 06, 2011 at 11:39:16AM -0400, Jon Masters wrote:
On Thu, 2011-10-06 at 16:20 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:14:56AM -0400, Jon Masters wrote:
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
The specification provides everything needed to express this data accurately, and proves the worth of specifications by virtue of approximately nobody actually implementing it correctly.
How about an actual DPI value?
The DPI depends on the mode. Not all the world is an LCD, and even there it depends on whether you're native, scaling or scaling with constrained aspect ratio.
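To put rough numbers on that, using the XO panel width from earlier in the thread (the 800-wide mode is only a hypothetical scaled mode for illustration):

  # Same 152 mm wide panel, two different modes, two different pixel densities.
  PANEL_WIDTH_MM = 152.0                 # the XO panel from earlier in the thread
  for width_px in (1200, 800):           # native mode vs. a hypothetical scaled mode
      print("%4d px wide -> %.0f dpi" % (width_px, width_px / (PANEL_WIDTH_MM / 25.4)))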
On Thu, 2011-10-06 at 16:46 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:39:16AM -0400, Jon Masters wrote:
On Thu, 2011-10-06 at 16:20 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 11:14:56AM -0400, Jon Masters wrote:
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
The specification provides everything needed to express this data accurately, and proves the worth of specifications by virtue of approximately nobody actually implementing it correctly.
How about an actual DPI value?
The DPI depends on the mode. Not all the world is an LCD, and even there it depends on whether you're native, scaling or scaling with constrained aspect ratio.
My main use case here is video projectors, and in that case there is no way on earth you'll ever know the DPI as it depends on the distance from the wall, and again even if you knew the distance from the wall you'd know nothing because the optimal DPI will also depend on the distance of the crowd from the wall.
So in that case you really should just give an option to the user to easily change DPI (no need to call the option 'DPIs', it can be a slider with no mention of DPI if you prefer) *if* it is needed. Chances are that a much wider font resulting from high density primary display derived DPI number combined with low resolution video projector screen will show big fonts and that will happen to be *exactly* what you want so the guy back there at the end of the room has a chance to actually read something :)
Then there will be guy X who hates the big fonts or has a ridiculously low-DPI primary screen, and he'll not be happy. You cannot please everyone with a default, but you can with an easy-to-discover option.
So whatever the situation, the slider should be right there in the tool you use to manage the additional monitors, or people will be forced to go search for the menu where they can change "something" to try to get a better font size, all resulting in a poor experience, as that menu is normally well hidden, being a rarely used option.
Simo.
On Thu, Oct 06, 2011 at 12:00:36PM -0400, Simo Sorce wrote:
So in that case you really should just give an option to the user to easily change DPI (no need to call the option 'DPIs', it can be a slider with no mention of DPI if you prefer) *if* it is needed. Chances are that a much wider font resulting from high density primary display derived DPI number combined with low resolution video projector screen will show big fonts and that will happen to be *exactly* what you want so the guy back there at the end of the room has a chance to actually read something :)
I absolutely agree that there should be an easy mechanism to globally change font size. But I don't think tying it to DPI is helpful.
On Thu, 2011-10-06 at 17:12 +0100, Matthew Garrett wrote:
On Thu, Oct 06, 2011 at 12:00:36PM -0400, Simo Sorce wrote:
So in that case you really should just give an option to the user to easily change DPI (no need to call the option 'DPIs', it can be a slider with no mention of DPI if you prefer) *if* it is needed. Chances are that a much wider font resulting from high density primary display derived DPI number combined with low resolution video projector screen will show big fonts and that will happen to be *exactly* what you want so the guy back there at the end of the room has a chance to actually read something :)
I absolutely agree that there should be an easy mechanism to globally change font size. But I don't think tying it to DPI is helpful.
Sure, call it whatever you prefer, it doesn't necessarily need to be DPI, but the concept of DPI would give a clue to all apps about what they are asked to do, whether it is font or some other rendering.
Simo.
On Thu, Oct 06, 2011 at 12:00:36 -0400, Simo Sorce simo@redhat.com wrote:
My main use case here is video projectors, and in that case there is no way on earth you'll ever know the DPI as it depends on the distance from the wall, and again even if you knew the distance from the wall you'd know nothing because the optimal DPI will also depend on the distance of the crowd from the wall.
What you really want to know is the resolution per angle or arc. And that would be relatively constant for a projector regardless of distance from the surface it is projecting on.
On Thu, 2011-10-06 at 11:41 -0500, Bruno Wolff III wrote:
On Thu, Oct 06, 2011 at 12:00:36 -0400, Simo Sorce simo@redhat.com wrote:
My main use case here is video projectors, and in that case there is no way on earth you'll ever know the DPI as it depends on the distance from the wall, and again even if you knew the distance from the wall you'd know nothing because the optimal DPI will also depend on the distance of the crowd from the wall.
What you really want to know is the resolution per angle or arc. And that would be relatively constant for a projector regardless of distance from the surface it is projecting on.
No, I do not really care because the apparent size of stuff depends on the distance of the viewer from the projected surface too, so having that value doesn't really tell you how big you should display stuff.
Also I do not need to know the resolution per arc to know that 10 pixels are always 10 pixels, just bigger because the wall is farther, except they are still too small for the job because the crowd is even farther from the wall than the projector is :)
Simo.
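For illustration, the angle a projected glyph subtends falls off with viewer distance, which nothing on the video cable can report; the sizes below are invented purely to show the shape of the problem:

  import math

  def visual_angle_deg(size_m, viewer_distance_m):
      # angle subtended by an object of the given height at the given distance
      return math.degrees(2 * math.atan(size_m / (2.0 * viewer_distance_m)))

  glyph_m = 0.05                   # pretend 10 projected pixels land ~5 cm tall on the wall
  for d in (2.0, 10.0, 20.0):      # front row versus the back of the room
      print("viewer at %4.1f m: %.2f degrees" % (d, visual_angle_deg(glyph_m, d)))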
On 2011/10/06 13:59 (GMT-0400) Simo Sorce composed:
the crowd is even farther from the wall than the projector is :)
Church sanctuary projectors are typically near the back, which means many, if not most, people are closer to the viewing surface than the projector is.
On 10/6/2011 12:41 PM, Bruno Wolff III wrote:
On Thu, Oct 06, 2011 at 12:00:36 -0400, Simo Sorce simo@redhat.com wrote:
My main use case here is video projectors, and in that case there is no way on earth you'll ever know the DPI as it depends on the distance from the wall, and again even if you knew the distance from the wall you'd know nothing because the optimal DPI will also depend on the distance of the crowd from the wall.
What you really want to know is the resolution per angle or arc. And that would be relatively constant for a projector regardless of distance from the surface it is projecting on.
What the ordinary user would like would be for Fedora to set the correct resolution on the damn monitor, or at least offer a choice other than *really crappy* or *even more crappy.*
I can think of several other Linux distributions that can do this during *the install*. No barriers to leap. No doors to open. No oceans to swim. No mountains to climb. It 'just works'.
No distro names, for politeness. Ask and I will post names. I can think of three that I know for a fact do this.
As good as it is, there are things at which Fedora flat out sucks!
On Thu, 2011-10-06 at 12:00 -0400, Simo Sorce wrote:
So in that case you really should just give an option to the user to easily change DPI (no need to call the option 'DPIs', it can be a slider with no mention of DPI if you prefer) *if* it is needed.
There actually is one, only it's called the Text Scaling Factor, and it's hidden in the Accessibility panel. It effectively sets the DPI to 96*TSF, so you have to do desired dpi / 96 to figure out what you want to set it to.
(That's a much better UI! Now, both people who know what DPI is and how to calculate it and people who don't are lost and confused. Equality ho!)
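To make the arithmetic concrete, a tiny sketch using the 201 DPI panel from the start of the thread:

  ASSUMED_DPI = 96.0      # what the desktop renders at
  real_dpi = 201.0        # the XO panel from the start of the thread

  tsf = real_dpi / ASSUMED_DPI        # ~2.09: the value to give Text Scaling Factor
  effective_dpi = ASSUMED_DPI * tsf   # and the density you effectively get back
  print("factor %.2f -> effective %.0f dpi" % (tsf, effective_dpi))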
Simo Sorce (simo@redhat.com) said:
My main use case here is video projectors, and in that case there is no way on earth you'll ever know the DPI as it depends on the distance from the wall, and again even if you knew the distance from the wall you'd know nothing because the optimal DPI will also depend on the distance of the crowd from the wall.
Obviously you embed radar in every projector.
Bill
On Thu, Oct 6, 2011 at 10:01 AM, Bill Nottingham notting@redhat.com wrote:
Obviously you embed radar in every projector.
Quite possible to do with existing off-the-shelf ultrasonic or diode laser telemetry used for DIY robotic range finding. In fact you can get ones that use i2c for data acquisition. I could probably mock something up with an Arduino unit to do the measurement, though I'd need some help getting a projector to use the calculation and hand it back over the wire to the computer.
-jef"sort of kidding...sort of"spaleta
Once upon a time, Bill Nottingham notting@redhat.com said:
Obviously you embed radar in every projector.
Projectors with auto-focus already detect the distance to the screen (I think they use IR). I don't expect that they change the EDID screen size reporting though.
On Thu, 2011-10-06 at 13:36 -0500, Chris Adams wrote:
Once upon a time, Bill Nottingham notting@redhat.com said:
Obviously you embed radar in every projector.
Projectors with auto-focus already detect the distance to the screen (I think they use IR). I don't expect that they change the EDID screen size reporting though.
But you don't (only) need to know the distance between the projector and the screen, you need to know the distance between the *audience* and the screen.
So, logically, what we need to do is make projectors capable of using Bluetooth to figure out what cellphones are nearby, and Facebook and Foursquare to check on their GPS locations...
On Thu, Oct 06, 2011 at 12:00:26PM -0700, Adam Williamson wrote:
On Thu, 2011-10-06 at 13:36 -0500, Chris Adams wrote:
Once upon a time, Bill Nottingham notting@redhat.com said:
Obviously you embed radar in every projector.
Projectors with auto-focus already detect the distance to the screen (I think they use IR). I don't expect that they change the EDID screen size reporting though.
But you don't (only) need to know the distance between the projector and the screen, you need to know the distance between the *audience* and the screen.
So, logically, what we need to do is make projectors capable of using Bluetooth to figure out what cellphones are nearby, and Facebook and Foursquare to check on their GPS locations...
Every time a DPI discussion begins, sooner or later projectors and points-per-arc appear. They are like Godwin's Law for DPI discussions. I think everybody agrees that true DPI cannot be found. But true DPI is only needed if we want a PERFECT setup. Perfect isn't possible. Can we just give up on the perfect setup and instead just go with GOOD ENOUGH?
Good Enough is getting the real DPI for the main screen and not really caring about secondary screens. Just read the DPI from the laptop panel (or the first detected output on desktops) and use it instead of 96. Let's not care about the not-so-common case of wildly different DPIs on connected screens. (Hey, my 300 DPI cellphone has HDMI output. I can connect it to a 70" TV, which gives 32 DPI. Let's not even try to accommodate the real DPI of the TV!)
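A rough sketch of that "good enough" policy, purely as an illustration (this is not what gnome-settings-daemon does today): take the first connected output that xrandr reports with a physical size, derive a DPI from it, and fall back to 96 otherwise.

  import re, subprocess

  dpi = 96.0   # the fallback
  xrandr = subprocess.check_output(["xrandr", "--query"]).decode()
  for line in xrandr.splitlines():
      # e.g. "LVDS1 connected 1200x900+0+0 (normal ...) 152mm x 114mm"
      m = re.search(r" connected.*?(\d+)x\d+\+\d+\+\d+.*?(\d+)mm x \d+mm", line)
      if m and int(m.group(2)) > 0:
          dpi = int(m.group(1)) / (int(m.group(2)) / 25.4)
          break
  print("using %.0f dpi" % dpi)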
On Thu, 2011-10-06 at 11:14 -0400, Jon Masters wrote:
On Tue, 2011-10-04 at 13:54 -0400, Adam Jackson wrote:
EDID does not reliably give you the size of the display.
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
Given that successive revisions of the spec have gone out of their way to make it acceptable for displays to provide _less_ useful information, on the grounds of manufacturing cost reduction, I think the momentum is quite in the other direction.
More pragmatically, VESA are not the people with any influence here. The only thing that matters to a monitor vendor is what Windows does when you plug it in. Linux can stamp its little foot all it wants. No one will care. If you want to be a big enough player in that market to have some influence, you have to start by playing in the sandbox that's already built, and in that sandbox physical dimensions are just not reliable and never will be.
Cope.
- ajax
On Thu, 2011-10-06 at 12:12 -0400, Adam Jackson wrote:
On Thu, 2011-10-06 at 11:14 -0400, Jon Masters wrote:
On Tue, 2011-10-04 at 13:54 -0400, Adam Jackson wrote:
EDID does not reliably give you the size of the display.
How about "EDID as it exists today". Since you're able to so beautifully explain what the pitfalls are, I'd assume you've raised this with the VESA and asked that they revisit this in the future to accurately provide DPI information that Operating Systems can rely on?
Given that successive revisions of the spec have gone out of their way to make it acceptable for displays to provide _less_ useful information, on the grounds of manufacturing cost reduction, I think the momentum is quite in the other direction.
More pragmatically, VESA are not the people with any influence here. The only thing that matters to a monitor vendor is what Windows does when you plug it in. Linux can stamp its little foot all it wants. No one will care. If you want to be a big enough player in that market to have some influence, you have to start by playing in the sandbox that's already built, and in that sandbox physical dimensions are just not reliable and never will be.
Cope.
Ok. I can cope, and not to flog a dead horse here... but has any effort been made anywhere on the open source side of things to influence future EDID specs? I'm sure Linux can stamp all it wants and nobody will care, but it probably doesn't hurt to raise this for discussion next time there's an update to the standard - or, shock, reach out to MSFT and see if they have any interest in working together on fixing this experience, which perhaps also causes problems they care about on Windows.
Just a suggestion.
Jon.
On Mon, Oct 03, 2011 at 04:48:11PM +0100, Camilo Mesias wrote:
Hi,
A daft question perhaps, but I thought...
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
How do other OS' do it?
Apple manage by virtue of most of their customers using their monitors. Windows doesn't appear to be DPI-sensitive.
On Mon, 2011-10-03 at 17:27 +0100, Matthew Garrett wrote:
On Mon, Oct 03, 2011 at 04:48:11PM +0100, Camilo Mesias wrote:
Hi,
A daft question perhaps, but I thought...
I'm not sure how we can make DPI magically be correct in gazillions of broken displays' EDID.
How do other OS' do it?
Apple manage by virtue of most of their customers using their monitors. Windows doesn't appear to be DPI-sensitive.
There's a fairly well-hidden setting in Windows' display config settings somewhere (depends on the exact version of Windows in use) which lets you pick between three hard-coded DPI settings (96, 120 or 150 or so, IIRC) or - again, I think it depends on the Windows version in use - specify a DPI value.
I don't believe Windows ever pays attention to the display's EDID-reported DPI.
As another poster in this thread has noted, this (Windows' hard-coded 96dpi default) has probably had a very significant impact on the market availability of displays with DPIs significantly above 96. You cannot, for instance, buy a 20" 1920x1080 monitor on the mass market, though there's no plausible technical reason why not. Another data point is that, when the Sony Vaio P (which has a 221 DPI display) came out, most of the reviews complained that the fonts in Windows were far too small...
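For reference, the arithmetic behind those figures (the 20-inch case is the example above; the XO numbers are from the start of the thread):

  import math

  def dpi(width_px, height_px, diagonal_inches):
      return math.hypot(width_px, height_px) / diagonal_inches

  print("%.0f dpi" % dpi(1920, 1080, 20.0))                        # a 20" 1080p panel: ~110
  print("%.0f dpi" % dpi(1200, 900, math.hypot(152, 114) / 25.4))  # the XO's 152mm x 114mm panel: ~201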
On Fri, 2011-09-30 at 11:00 +0100, Daniel Drake wrote:
Hi,
I'm working on bringing OLPC up-to-date with all the great efforts with GNOME 3, systemd, etc.
On the OLPC XO laptops we have quite a strange screen - it is small (152mm x 114mm) but very high resolution (1200x900 i.e. 201 dots per inch).
Previously, on Fedora 14, we had to adjust the default GNOME font sizes since they didn't look right on the screen (I think they were too big). Now I'm looking at applying the same set of customisations to Fedora 16 since the default fonts are uncomfortably small on our display.
We currently hard-code the DPI: http://git.gnome.org/browse/gnome-settings-daemon/tree/plugins/xsettings/gsd...
But you can work around the problem by using the text-scaling-factor GSettings key, which would then be "201 / 96".
HTH
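For completeness, a minimal sketch of that workaround, assuming the key lives under the org.gnome.desktop.interface schema (adjust if your GNOME version keeps it elsewhere):

  import subprocess

  real_dpi, assumed_dpi = 201.0, 96.0
  factor = real_dpi / assumed_dpi     # ~2.09 for the XO panel
  subprocess.check_call([
      "gsettings", "set", "org.gnome.desktop.interface",
      "text-scaling-factor", "%.2f" % factor,
  ])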