Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
There are several good reasons for this; rather than repeating them, I'm just going to point to the Debian announcement:
https://tjaalton.wordpress.com/2016/07/23/intel-graphics-gen4-and-newer-now-defaults-to-modesetting-driver-on-x/
This mail is to let all Fedora users know that starting with Fedora-26 / rawhide as of today, we are making the same change.
Note that the xorg-x11-drv-intel package has already been carrying a Fedora patch to not bind to the GPU on Skylake or newer, even before Debian announced this; this just makes the same change for older Intel GPUs.
For people who are using the now default GNOME3 on Wayland session, nothing changes, since Xwayland always uses glamor for X acceleration, just like the modesetting driver.
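If you want to double-check which driver your X session ended up with, something like this should do (note the log may instead live under ~/.local/share/xorg/ when the server runs without root rights):

# "modeset(0)" lines mean the modesetting driver is in use,
# "intel(0)" lines mean the intel ddx
grep -E 'modeset\(0\)|intel\(0\)' /var/log/Xorg.0.log | head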
If you encounter any issues caused by this change, please file a bug in bugzilla.
Regards,
Hans
On 01/10/2017 08:22 AM, Hans de Goede wrote:
If you encounter any issues caused by this change, please file a bug in bugzilla.
Are performance regressions covered under this clause?
Iris 5100 (Haswell)
gtkperf - Intel = ~29 seconds
gtkperf - Modeset = ~35 seconds
Fairly significant change.
Thanks, Michael
Hi,
On 01/10/2017 06:59 PM, Michael Cronenworth wrote:
On 01/10/2017 08:22 AM, Hans de Goede wrote:
If you encounter any issues caused by this change, please file a bug in bugzilla.
Are performance regressions covered under this clause?
User-visible changes (e.g. some program slowing to a few fps; this has happened in some cases), yes.
Micro benchmark results, no.
Regards,
Hans
Iris 5100 (Haswell)
gtkperf - Intel = ~29 seconds
gtkperf - Modeset = ~35 seconds
Fairly significant change.
Thanks, Michael
On Tue, 2017-01-10 at 11:59 -0600, Michael Cronenworth wrote:
Are performance regressions covered under this clause?
Iris 5100 (Haswell)
gtkperf - Intel = ~29 seconds
gtkperf - Modeset = ~35 seconds
Fairly significant change.
On a benchmark that doesn't reflect real usage very well, but sure. Can you drill down on this a bit? Which subtests regress the most?
- ajax
On 01/11/2017 10:52 AM, Adam Jackson wrote:
On a benchmark that doesn't reflect real usage very well, but sure. Can you drill down on this a bit? Which subtests regress the most?
Unfortunately the state of benchmarking in Linux is extremely poor. I'm not here to discuss the non-existent ecosystem of performance testing, but I appreciate that you are willing to look at the test results.
I have compiled the results into a spreadsheet. There is not one single sub-test that performs worse than the others; the overall times are slower with modesetting across the board.
https://docs.google.com/spreadsheets/d/1YWC0C8rxIpGt185iTSjs2cC1vlaFg412JMgZ...
If I could reproduce this performance delta through other reproducible and measurable means, I would gladly do so. The reduction in performance will have a negative impact on battery life, which is important to me on the platform I am running the tests on: a laptop.
On Fri, Jan 13, 2017 at 09:51:32AM -0600, Michael Cronenworth wrote:
Unfortunately the state of benchmarking in Linux is extremely poor. I'm not here to discuss the non-existent ecosystem of performance testing, but I appreciate that you are willing to look at the test results.
Is the stuff from Phoronix's openbenchmarking.org helpful at all? http://openbenchmarking.org/suites/pts
On 01/13/2017 10:19 AM, Matthew Miller wrote:
Is the stuff from Phoronix's openbenchmarking.org helpful at all? http://openbenchmarking.org/suites/pts
It does not implement any benchmarks itself and relies on the same tests that are not accepted by most people you show the results to. It only provides a platform to run the tests and spit out the results in a pretty way.
On 01/13/2017 10:51 AM, Michael Cronenworth wrote:
On 01/11/2017 10:52 AM, Adam Jackson wrote:
On a benchmark that doesn't reflect real usage very well, but sure. Can you drill down on this a bit? Which subtests regress the most?
I have compiled the results into a spreadsheet. There is not one single sub-test that performs worse than the others; the overall times are slower with modesetting across the board.
https://docs.google.com/spreadsheets/d/1YWC0C8rxIpGt185iTSjs2cC1vlaFg412JMgZ...
TL;DR: DrawingArea Circles seems to be the worst offender overall: it runs at 40% of Intel speed and is used relatively often in this test. Other significant slowdowns are around 83% in the ComboBox* tests. The worst offender in relative terms is DrawingArea Pixbuf, running at 25% of Intel speed, but it is not very significant in this test because it runs for a very short time.
On Tuesday, 10 January 2017 at 18:59, Michael Cronenworth wrote:
On 01/10/2017 08:22 AM, Hans de Goede wrote:
If you encounter any issues caused by this change, please file a bug in bugzilla.
Are performance regressions covered under this clause?
Iris 5100 (Haswell)
gtkperf - Intel = ~29 seconds
gtkperf - Modeset = ~35 seconds
Fairly significant change.
Which package contains this benchmark? I can't find it in Fedora repos.
Regards, Dominik
On Thu, Jan 12, 2017 at 01:42:23PM +0100, Dominik 'Rathann' Mierzejewski wrote:
On Tuesday, 10 January 2017 at 18:59, Michael Cronenworth wrote:
On 01/10/2017 08:22 AM, Hans de Goede wrote:
If you encounter any issues caused by this change, please file a bug in bugzilla.
Are performance regressions covered under this clause?
Iris 5100 (Haswell)
gtkperf - Intel = ~29 seconds
gtkperf - Modeset = ~35 seconds
Fairly significant change.
Which package contains this benchmark? I can't find it in Fedora repos.
Cf. http://www.phoronix.com/scan.php?page=article&item=intel-modesetting-201...
Zbyszek
On 01/12/2017 06:42 AM, Dominik 'Rathann' Mierzejewski wrote:
Which package contains this benchmark? I can't find it in Fedora repos.
GtkPerf was retired in Fedora a few versions ago. I should revive it and update it to use Gtk3 and include "real-world" cases, but my time is limited. :(
The last build can still be found on Koji.
https://koji.fedoraproject.org/koji/buildinfo?buildID=650482
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
Hello,
Is it possible to configure xserver to use "intel" driver without recompiling it?
Best regards, Samuel
Hi,
On 11-01-17 12:15, Samuel Rakitničan wrote:
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
Hello,
Is it possible to configure xserver to use "intel" driver without recompiling it?
Yes, we're just changing the default. If you drop a 99-local.conf file in /etc/X11/xorg.conf.d with the following contents:
Section "OutputClass" Identifier "intel" MatchDriver "i915" Driver "intel" EndSection
Then the intel driver should be used for any GPU driven by the i915 kernel driver.
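The opposite should work the same way: to force the modesetting driver on a release where intel is still the default, a similar (untested, off the top of my head) snippet ought to do it:

Section "OutputClass"
    Identifier "modesetting"
    MatchDriver "i915"
    Driver "modesetting"
EndSection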
Regards,
Hans
On Wednesday, 11 January 2017 at 14:24, Hans de Goede wrote:
Hi,
On 11-01-17 12:15, Samuel Rakitničan wrote:
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
Hello,
Is it possible to configure xserver to use "intel" driver without recompiling it?
Yes, we're just changing the default. If you drop a 99-local.conf file in /etc/X11/xorg.conf.d with the following contents:
Section "OutputClass" Identifier "intel" MatchDriver "i915" Driver "intel" EndSection
Then the intel driver should be used for any GPU driven by the i915 kernel driver.
How does one do the opposite (i.e. switch to the modesetting driver) on, say, Fedora 25 running on Haswell? I'd like to test if that works around a particularly annoying system freeze bug (https://bugs.freedesktop.org/show_bug.cgi?id=98760).
Regards, Dominik
On Tue, 2017-01-10 at 15:22 +0100, Hans de Goede wrote:
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
There are several good reasons for this; rather than repeating them, I'm just going to point to the Debian announcement:
https://tjaalton.wordpress.com/2016/07/23/intel-graphics-gen4-and-newer-now-defaults-to-modesetting-driver-on-x/
This mail is to let all Fedora users know that starting with Fedora-26 / rawhide as of today, we are making the same change.
Note that the xorg-x11-drv-intel package has already been carrying a Fedora patch to not bind to the GPU on Skylake or newer, even before Debian announced this; this just makes the same change for older Intel GPUs.
For people who are using the now default GNOME3 on Wayland session, nothing changes, since Xwayland always uses glamor for X acceleration, just like the modesetting driver.
If you encounter any issues caused by this change, please file a bug in bugzilla.
Regards,
Hans
I just want to mention that this change has been pushed (merged) to the f25 branch as well (which was not planned, I guess). I filed bug #1413251 [1]
On Sat, 2017-01-14 at 08:20 +0100, Branko Grubic wrote:
I just want to mention that this change has been pushed (merged) to the f25 branch as well (which was not planned, I guess). I filed bug #1413251 [1]
D'oh, my bad. New update in testing shortly.
- ajax
On my ThinkPad X220 Tablet running Fedora 25 I cannot change the display brightness of the internal screen using `xbacklight`. It fails with
No outputs have backlight property
The combination FN+PgDn that worked in KDE no longer works. It seems that I have no control over the backlight any more. Any way to get it back under control?
On Sun, 15 Jan 2017 18:58:43 +0100 Martin Ueding lists@martin-ueding.de wrote:
On my ThinkPad X220 Tablet running Fedora 25 I cannot change the display brightness of the internal screen using `xbacklight`. It fails with
No outputs have backlight property
The combination FN+PgDn that worked in KDE no longer works. It seems that I have no control over the backlight any more. Any way to get it back under control?
Likely you are logged into a wayland session? In that case xbacklight won't work. You could choose a Gnome on X11 session.
Gnome should let you adjust the brightness however... does that slider not work?
kevin
Am 15.01.2017 um 20:23 schrieb Kevin Fenzi:
Likely you are logged into a wayland session? In that case xbacklight won't work. You could choose a Gnome on X11 session.
I use Awesome WM which uses X.
Gnome should let you adjust the brightness however... does that slider not work?
I'll have to see whether I can use the GNOME session. Last time I tried, neither variant let me log in.
On Sun, 15 Jan 2017 20:29:14 +0100 Martin Ueding lists@martin-ueding.de wrote:
Am 15.01.2017 um 20:23 schrieb Kevin Fenzi:
Likely you are logged into a wayland session? In that case xbacklight won't work. You could choose a Gnome on X11 session.
I use Awesome WM which uses X.
Ah, then I have no idea what could be going on. Did xbacklight work in the past?
kevin
Am 16.01.2017 um 17:18 schrieb Kevin Fenzi:
Did xbacklight work in the past?
Yes, it did.
To work around that bug, I use KDE Plasma and there I can change the brightness with Fn+PgUp. `xbacklight` does not work there, saying that no outputs have a backlight property.
Hi,
On 19-01-17 16:14, Martin Ueding wrote:
Am 16.01.2017 um 17:18 schrieb Kevin Fenzi:
Did xbacklight work in the past?
Yes, it did.
To work around that bug, I use KDE Plasma and there I can change the brightness with Fn+PgUp. `xbacklight` does not work there, saying that no outputs have a backlight property.
Yes, that is an expected result of switching to the modesetting driver. The intel ddx is the only driver which ever implemented an xrandr backlight property; none of the others (ati, nouveau) have ever offered this, so most tools simply write directly to /sys/class/backlight, but xbacklight relies on the xrandr property (and is the only tool to do so AFAICT).
Maybe someone should look into making xbacklight access the sysfs interface directly like gnome, kde and xfce are doing. This does require root rights; a patch for this could re-use the
/usr/libexec/xf86-video-intel-backlight-helper
See xf86-video-intel/src/backlight.c for how to use this.
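For reference, the direct sysfs access amounts to something like this (the intel_backlight device name is just an example; it varies per machine):

# list available backlight devices and query the allowed range
ls /sys/class/backlight/
cat /sys/class/backlight/intel_backlight/max_brightness
# writing requires root, e.g.:
echo 400 | sudo tee /sys/class/backlight/intel_backlight/brightness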
Regards,
Hans
Hans de Goede wrote:
most tools simply write directly to /sys/class/backlight, but xbacklight relies on the xrandr property (and is the only tool to do so AFAICT).
KDE's PowerDevil supports both and prefers XRandR where supported: https://cgit.kde.org/powerdevil.git/tree/daemon/backends/upower
As you pointed out, the nice thing about the XRandR property is that it does not require root access, whereas the sysfs interface requires going through a KAuth/PolicyKit helper to get root (which PowerDevil sets up with a default policy of "yes" so that all users can use it without a PolicyKit password prompt). It is sad that most drivers did not bother implementing it.
Kevin Kofler
Hi,
On 20-01-17 02:05, Kevin Kofler wrote:
Hans de Goede wrote:
most tools simply write directly to /sys/class/backlight, but xbacklight relies on the xrandr property (and is the only tool to do so AFAICT).
KDE's PowerDevil supports both and prefers XRandR where supported: https://cgit.kde.org/powerdevil.git/tree/daemon/backends/upower
As you pointed out, the nice thing about the XRandR property is that it does not require root access, whereas the sysfs interface requires going through a KAuth/PolicyKit helper to get root (which PowerDevil sets up with a default policy of "yes" so that all users can use it without a PolicyKit password prompt). It is sad that most drivers did not bother implementing it.
Actually, since the xserver no longer runs as root nowadays, the xf86-video-* drivers need to jump through the same hoops, which is why xorg-x11-drv-intel has /usr/libexec/xf86-video-intel-backlight-helper, which it starts through pkexec ...
Note that systemd already has some backlight handling to save / restore backlight settings over a reboot. I believe the real solution here might be to have a systemd-backlightd or some such, rather than have all DEs reinvent the wheel.
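(That existing handling is the systemd-backlight@.service template; e.g. on a machine with an intel_backlight device, assuming I have the instance name right:

systemctl status 'systemd-backlight@backlight:intel_backlight.service'

But it only saves the level at shutdown and restores it at boot; it offers no runtime control.)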
Regards,
Hans
On Tue, 2017-01-10 at 15:22 +0100, Hans de Goede wrote:
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
There are several good reasons for this; rather than repeating them, I'm just going to point to the Debian announcement:
https://tjaalton.wordpress.com/2016/07/23/intel-graphics-gen4-and-newer-now-defaults-to-modesetting-driver-on-x/
This mail is to let all Fedora users know that starting with Fedora-26 / rawhide as of today, we are making the same change.
Note that the xorg-x11-drv-intel package has already been carrying a Fedora patch to not bind to the GPU on Skylake or newer, even before Debian announced this; this just makes the same change for older Intel GPUs.
For people who are using the now default GNOME3 on Wayland session, nothing changes, since Xwayland always uses glamor for X acceleration, just like the modesetting driver.
If you encounter any issues caused by this change, please file a bug in bugzilla.
The default for modesetting is to enable glamor, and glamor doesn't run on 32-bit archs:
[ 42.108] (WW) glamor requires at least 128 instructions (64 reported)
I used modesetting on F25 with Option "AccelMethod" "none" and it worked very well. The intel driver crashes when using pipelight, and with modesetting the crash doesn't happen, but I need to use a non-default option ...
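The snippet I use is along these lines (the identifier name is arbitrary):

Section "Device"
    Identifier "Intel Graphics"
    Driver "modesetting"
    Option "AccelMethod" "none"
EndSection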
Cheers,
Hi,
On 22-02-17 03:42, Sérgio Basto wrote:
On Ter, 2017-01-10 at 15:22 +0100, Hans de Goede wrote:
Hi,
A while back Debian switched to using the modesetting Xorg driver rather than the intel Xorg driver for Intel GPUs.
There are several good reasons for this; rather than repeating them, I'm just going to point to the Debian announcement:
https://tjaalton.wordpress.com/2016/07/23/intel-graphics-gen4-and-newer-now-defaults-to-modesetting-driver-on-x/
This mail is to let all Fedora users know that starting with Fedora-26 / rawhide as of today, we are making the same change.
Note that the xorg-x11-drv-intel package has already been carrying a Fedora patch to not bind to the GPU on Skylake or newer, even before Debian announced this; this just makes the same change for older Intel GPUs.
For people who are using the now default GNOME3 on Wayland session, nothing changes, since Xwayland always uses glamor for X acceleration, just like the modesetting driver.
If you encounter any issues caused by this change, please file a bug in bugzilla.
The default for modesetting is to enable glamor, and glamor doesn't run on 32-bit archs
What makes you say that glamor does not run on 32-bit archs?
Regards,
Hans
On Wed, 2017-02-22 at 02:42 +0000, Sérgio Basto wrote:
The default for modesetting is to enable glamor
Correct.
and glamor doesn't run on 32-bit archs
Incorrect. Glamor works fine on 32-bit CPUs, and on 64-bit CPUs if you force them to run 32-bit binaries. What it doesn't work on is some of the GPUs that happen to be commonly attached to 32-bit CPUs. Which is what this:
[ 42.108] (WW) glamor requires at least 128 instructions (64 reported)
... is trying to say. The "gen3" family of Intel GPUs (i915, i945, G33) are (to put it politely) garbage. Though they claim to support fragment shaders, the instruction limit of those shaders is far less than what glamor requires.
We knew this, though, which is why our (actually Debian's) patch to the X server to default to modesetting on intel only does so for gen4 and newer:
http://pkgs.fedoraproject.org/cgit/rpms/xorg-x11-server.git/tree/06_use-intel-only-on-pre-gen4.diff
This way gen2 and gen3 still get native 2D and 3D acceleration.
I used modesetting on F25 with Option "AccelMethod" "none" and it worked very well. The intel driver crashes when using pipelight, and with modesetting the crash doesn't happen, but I need to use a non-default option ...
That's just a bug in the intel driver, then. Can you be more specific?
- ajax
On Wed, 2017-02-22 at 15:20 -0500, Adam Jackson wrote:
On Wed, 2017-02-22 at 02:42 +0000, Sérgio Basto wrote:
The default for modesetting is to enable glamor
Correct.
and glamor doesn't run on 32-bit archs
Incorrect. Glamor works fine on 32-bit CPUs, and on 64-bit CPUs if you force them to run 32-bit binaries. What it doesn't work on is some of the GPUs that happen to be commonly attached to 32-bit CPUs. Which is what this:
[ 42.108] (WW) glamor requires at least 128 instructions (64 reported)
... is trying to say. The "gen3" family of Intel GPUs (i915, i945, G33) are (to put it politely) garbage. Though they claim to support fragment shaders, the instruction limit of those shaders is far less than what glamor requires.
Ah OK, so it's not a 32-bit problem.
We knew this, though, which is why our (actually Debian's) patch to the X server to default to modesetting on intel only does so for gen4 and newer:
http://pkgs.fedoraproject.org/cgit/rpms/xorg-x11-server.git/tree/06_use-intel-only-on-pre-gen4.diff
This way gen2 and gen3 still get native 2D and 3D acceleration.
I used modesetting on F25 with Option "AccelMethod" "none" and it worked very well. The intel driver crashes when using pipelight, and with modesetting the crash doesn't happen, but I need to use a non-default option ...
That's just a bug in the intel driver, then. Can you be more specific?
My i915 works better with the modesetting driver under a complex silverlight emulation with wine and pipelight in Firefox, but I have to disable glamor for it to boot. With the default intel driver under F25 it crashes after playing 2 or 3 minutes; I send one backtrace attached.
In conclusion, modesetting also works better on old graphics cards, at least in this particular case (and I'm very happy to have silverlight emulation working).
Thanks,
... is trying to say. The "gen3" family of Intel GPUs (i915, i945, G33) are (to put it politely) garbage. Though they claim to support fragment shaders, the instruction limit of those shaders is far less than what glamor requires.
These GPUs will stop claiming fragment shader support from Mesa 17.1 on [0], just FYI.
[0] https://cgit.freedesktop.org/mesa/mesa/commit/?id=a1891da7c865c80d95c450abfc...
On Wed, 2017-02-22 at 22:51 +0000, František Zatloukal wrote:
... is trying to say. The "gen3" family of Intel GPUs (i915, i945, G33) are (to put it politely) garbage. Though they claim to support fragment shaders, the instruction limit of those shaders is far less than what glamor requires.
These GPUs will stop claiming fragment shader support from Mesa 17.1 on [0], just FYI.
Oh good! That's for the best, really.
glamor also has a gles2 path, but right now it's known to be broken. We'd like to fix that in general since some arm chips are never going to be able to do enough desktop GL to satisfy glamor. When/if that happens we can revisit glamor on gen3.
- ajax