* UDL device cannot get its own screen
@ 2019-10-22 15:50 Böszörményi Zoltán
2019-10-22 20:57 ` Ilia Mirkin
2019-10-23 7:42 ` Pekka Paalanen
0 siblings, 2 replies; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-10-22 15:50 UTC (permalink / raw)
To: xorg, Mailing list - DRI developers
Hi,
I have the below configuration for an Intel-based POS system that,
while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
has only two usable: DP1 for the built-in touchscreen and VGA1 for
the external VGA connector.
I wanted to use a USB DisplayLink device as the 3rd output, with each
of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
The first observation is that I can't use the Intel DDX driver in
conjunction with the modesetting DDX that the UDL device uses. The
symptom is that two modesetting outputs are initialized, one for UDL
and one for the disconnected HDMI1 Intel output. At least now the
X server doesn't crash, as it did with Xorg 1.19.x on a similar attempt.
The second is that when the modesetting driver is used, the Intel outputs
are renamed from VGA1 to VGA-1 and so on, i.e. the outputs get an extra
"-" between the output type and the number, so porting the original
config from intel to modesetting needed extra typing.
The third observation is that although I am using the configuration below,
so the UDL device should be assigned to :0.2 (and active!), it is actually
assigned to :0[.0] as an inactive output. Note that there is no "*" indicator
set for any of the supported modes on DVI-I-1-1.
How can I set up 3 different Screens correctly for 3 separate fullscreen
applications?
I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
patch from Dave Airlie that at least wakes up the UDL device and makes
it visible without extra magic with providers/sinks.
# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 1024 x 768, maximum 8192 x 8192
DP-1 connected primary 1024x768+0+0 (normal left inverted right x axis y axis) 304mm x 228mm
1024x768 60.00*+
DVI-I-1-1 connected (normal left inverted right x axis y axis)
1024x768 75.03 + 60.00
1920x1080 60.00 +
1680x1050 59.88
1280x1024 75.02 60.02
1440x900 74.98 59.90
1280x720 60.00
800x600 75.00 60.32
640x480 75.00 72.81 66.67 59.94
720x400 70.08
1024x768 (0x4a) 65.000MHz -HSync -VSync
h: width 1024 start 1048 end 1184 total 1344 skew 0 clock 48.36KHz
v: height 768 start 771 end 777 total 806 clock 60.00Hz
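The clocks xrandr prints for a modeline follow directly from the pixel clock and the total timings. A quick sanity check of the 1024x768 (0x4a) line above, using only the numbers shown there (plain arithmetic, no Xorg code involved):

```python
# Values copied from the verbose xrandr output for 1024x768 (0x4a)
pixel_clock_hz = 65.000e6   # "65.000MHz"
h_total = 1344              # horizontal total
v_total = 806               # vertical total

# line rate = pixel clock / horizontal total
h_clock_khz = pixel_clock_hz / h_total / 1e3
# refresh rate = pixel clock / (horizontal total * vertical total)
refresh_hz = pixel_clock_hz / (h_total * v_total)

print(f"{h_clock_khz:.2f} kHz, {refresh_hz:.2f} Hz")  # 48.36 kHz, 60.00 Hz
```

This matches the "clock 48.36KHz" and "clock 60.00Hz" figures xrandr reports.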
# cat /etc/X11/xorg.conf.d/videocard.conf
Section "Monitor"
Identifier "Monitor-DP-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "Monitor-HDMI-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "Monitor-VGA-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "DVI-I-1-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Device"
Identifier "Intel0"
Driver "modesetting"
Option "kmsdev" "/dev/dri/card1"
Screen 0
Option "Monitor-DP1" "DP-1"
Option "ZaphodHeads" "DP-1"
EndSection
Section "Device"
Identifier "Intel1"
Driver "modesetting"
Option "kmsdev" "/dev/dri/card1"
Screen 1
Option "Monitor-VGA-1" "VGA-1"
Option "ZaphodHeads" "VGA-1"
EndSection
# Intentionally not referenced in ServerLayout below
Section "Device"
Identifier "Intel2"
Driver "modesetting"
Option "kmsdev" "/dev/dri/card1"
Option "Monitor-HDMI-1" "HDMI-1"
Option "ZaphodHeads" "HDMI-1"
EndSection
Section "Device"
Identifier "UDL"
Driver "modesetting"
Option "kmsdev" "/dev/dri/card0"
Screen 2
Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
EndSection
Section "Screen"
Identifier "SCREEN"
Option "AutoServerLayout" "on"
Device "Intel0"
Monitor "Monitor-DP-1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "Screen"
Identifier "SCREEN1"
Option "AutoServerLayout" "on"
Device "Intel1"
Monitor "Monitor-VGA-1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "Screen"
Identifier "SCREEN2"
Option "AutoServerLayout" "on"
Device "UDL"
Monitor "Monitor-DVI-I-1-1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "ServerLayout"
Identifier "LAYOUT"
Option "AutoServerLayout" "on"
Screen 0 "SCREEN"
Screen 1 "SCREEN1" RightOf "SCREEN"
Screen 2 "SCREEN2" RightOf "SCREEN1"
EndSection
Best regards,
Zoltán Böszörményi
_______________________________________________
dri-devel mailing list
dri-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/dri-devel
^ permalink raw reply [flat|nested] 14+ messages in thread
* Re: UDL device cannot get its own screen
2019-10-22 15:50 UDL device cannot get its own screen Böszörményi Zoltán
@ 2019-10-22 20:57 ` Ilia Mirkin
2019-10-23 6:41 ` Böszörményi Zoltán
2019-10-23 7:42 ` Pekka Paalanen
1 sibling, 1 reply; 14+ messages in thread
From: Ilia Mirkin @ 2019-10-22 20:57 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Mailing list - DRI developers
On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>
> Hi,
>
> I have the below configuration for an Intel-based POS system that,
> while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
> has only two usable: DP1 for the built-in touchscreen and VGA1 for
> the external VGA connector.
>
> I wanted to use a USB DisplayLink device as the 3rd output, with each
> of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
>
> [...]
>
> How can I set up 3 different Screens correctly for 3 separate fullscreen
> applications?
>
> I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
> patch from Dave Airlie that at least wakes up the UDL device and makes
> it visible without extra magic with providers/sinks.
If it's being treated as a GPU, that's your first problem for this
kind of setup. You should see modeset(2) in your logs, but I suspect
you're seeing modeset(G0) (the "G" indicates "GPU").
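The modeset(2)-vs-modeset(G0) distinction can be checked mechanically against the log. A minimal sketch (the regex and helper are our own illustration, not anything from the Xorg sources):

```python
import re

def classify(log_line):
    """Return ('gpu'|'screen', index) for an Xorg modeset(...) log line.

    A leading "G" in the parenthesized index marks a slaved GPU screen;
    a bare number is a regular protocol screen.
    """
    m = re.search(r"modeset\((G?)(\d+)\)", log_line)
    if not m:
        return None
    kind = "gpu" if m.group(1) == "G" else "screen"
    return kind, int(m.group(2))

print(classify("[   879.136] (II) modeset(G0): Damage tracking initialized"))
# ('gpu', 0) -- treated as a GPU, which is the problem described above
```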
>
> # cat /etc/X11/xorg.conf.d/videocard.conf
> Section "Monitor"
> Identifier "Monitor-DP-1"
> Option "AutoServerLayout" "on"
> Option "Rotate" "normal"
> EndSection
>
> Section "Monitor"
> Identifier "Monitor-HDMI-1"
> Option "AutoServerLayout" "on"
> Option "Rotate" "normal"
> EndSection
>
> Section "Monitor"
> Identifier "Monitor-VGA-1"
> Option "AutoServerLayout" "on"
> Option "Rotate" "normal"
> EndSection
>
> Section "Monitor"
> Identifier "DVI-I-1-1"
The others are Monitor-*, this one isn't. You probably want this to be
DVI-I-1, as noted below. I guess you get the extra -1 from seeing it
as a slaved GPU's output in your current configuration.
> Option "AutoServerLayout" "on"
> Option "Rotate" "normal"
> EndSection
>
> Section "Device"
> Identifier "Intel0"
> Driver "modesetting"
> Option "kmsdev" "/dev/dri/card1"
> Screen 0
> Option "Monitor-DP1" "DP-1"
> Option "ZaphodHeads" "DP-1"
> EndSection
>
> Section "Device"
> Identifier "Intel1"
> Driver "modesetting"
> Option "kmsdev" "/dev/dri/card1"
> Screen 1
> Option "Monitor-VGA-1" "VGA-1"
> Option "ZaphodHeads" "VGA-1"
> EndSection
>
> # Intentionally not referenced in ServerLayout below
> Section "Device"
> Identifier "Intel2"
> Driver "modesetting"
> Option "kmsdev" "/dev/dri/card1"
> Option "Monitor-HDMI-1" "HDMI-1"
> Option "ZaphodHeads" "HDMI-1"
> EndSection
>
> Section "Device"
> Identifier "UDL"
> Driver "modesetting"
> Option "kmsdev" "/dev/dri/card0"
> Screen 2
> Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
I think you have an extra -1 in here (and the monitor name doesn't
exist as per above). And I think the "Screen" index is wrong -- it's
not what one tends to think it is, as I recall. I think you can just
drop these lines though.
> EndSection
>
> Section "Screen"
> Identifier "SCREEN"
> Option "AutoServerLayout" "on"
> Device "Intel0"
> Monitor "Monitor-DP-1"
> SubSection "Display"
> Modes "1024x768"
> Depth 24
> EndSubSection
> EndSection
>
> Section "Screen"
> Identifier "SCREEN1"
> Option "AutoServerLayout" "on"
> Device "Intel1"
> Monitor "Monitor-VGA-1"
> SubSection "Display"
> Modes "1024x768"
> Depth 24
> EndSubSection
> EndSection
>
> Section "Screen"
> Identifier "SCREEN2"
> Option "AutoServerLayout" "on"
> Device "UDL"
> Monitor "Monitor-DVI-I-1-1"
> SubSection "Display"
> Modes "1024x768"
> Depth 24
> EndSubSection
> EndSection
>
> Section "ServerLayout"
> Identifier "LAYOUT"
> Option "AutoServerLayout" "on"
> Screen 0 "SCREEN"
> Screen 1 "SCREEN1" RightOf "SCREEN"
> Screen 2 "SCREEN2" RightOf "SCREEN1"
> EndSection
>
> Best regards,
> Zoltán Böszörményi
* Re: UDL device cannot get its own screen
2019-10-22 20:57 ` Ilia Mirkin
@ 2019-10-23 6:41 ` Böszörményi Zoltán
2019-10-23 13:32 ` Ilia Mirkin
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-10-23 6:41 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Mailing list - DRI developers
2019. 10. 22. 22:57 keltezéssel, Ilia Mirkin írta:
> On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>
>> Hi,
>>
>> I have the below configuration for an Intel-based POS system that,
>> while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
>> has only two usable: DP1 for the built-in touchscreen and VGA1 for
>> the external VGA connector.
>>
>> I wanted to use a USB DisplayLink device as the 3rd output, with each
>> of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
>>
>> [...]
>>
>> How can I set up 3 different Screens correctly for 3 separate fullscreen
>> applications?
>>
>> I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
>> patch from Dave Airlie that at least wakes up the UDL device and makes
>> it visible without extra magic with providers/sinks.
>
> If it's being treated as a GPU, that's your first problem for this
> kind of setup. You should see modeset(2) in your logs, but I suspect
> you're seeing modeset(G0) (the "G" indicates "GPU").
modeset(2) is the unconnected HDMI-1 output advertised by the Intel chip.
modeset(G0) is UDL.
>
>>
>> [...]
>> Section "Monitor"
>> Identifier "DVI-I-1-1"
>
> The others are Monitor-*, this one isn't. You probably want this to be
> DVI-I-1, as noted below. I guess you get the extra -1 from seeing it
> as a slaved GPU's output in your current configuration.
Indeed. Fixed.
>
>> Option "AutoServerLayout" "on"
>> Option "Rotate" "normal"
>> EndSection
>>
>> [...]
>>
>> Section "Device"
>> Identifier "UDL"
>> Driver "modesetting"
>> Option "kmsdev" "/dev/dri/card0"
>> Screen 2
>> Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
>
> I think you have an extra -1 in here (and the monitor name doesn't
> exist as per above). And I think the "Screen" index is wrong -- it's
> not what one tends to think it is, as I recall. I think you can just
> drop these lines though.
Without "Screen N" lines, all the outputs are assigned to :0
so the screen layout setup in the ServerLayout section is not
applied properly.
I have read Dave Airlie's patch (which has been accepted into Xorg
1.21.0) more closely, and it indeed binds UDL to DISPLAY=:0.
I think this patch needs a follow-up so it uses the screen ID
specified in the Device section.
>
>> EndSection
>>
>> [...]
>>
>> Section "ServerLayout"
>> Identifier "LAYOUT"
>> Option "AutoServerLayout" "on"
>> Screen 0 "SCREEN"
>> Screen 1 "SCREEN1" RightOf "SCREEN"
>> Screen 2 "SCREEN2" RightOf "SCREEN1"
>> EndSection
>>
>> Best regards,
>> Zoltán Böszörményi
* Re: UDL device cannot get its own screen
2019-10-22 15:50 UDL device cannot get its own screen Böszörményi Zoltán
2019-10-22 20:57 ` Ilia Mirkin
@ 2019-10-23 7:42 ` Pekka Paalanen
2019-10-23 12:12 ` Böszörményi Zoltán
1 sibling, 1 reply; 14+ messages in thread
From: Pekka Paalanen @ 2019-10-23 7:42 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Mailing list - DRI developers
On Tue, 22 Oct 2019 17:50:21 +0200
Böszörményi Zoltán <zboszor@pr.hu> wrote:
> Hi,
>
> I have the below configuration for an Intel-based POS system that,
> while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
> has only two usable: DP1 for the built-in touchscreen and VGA1 for
> the external VGA connector.
>
> I wanted to use a USB DisplayLink device as the 3rd output, with each
> of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
...
> The third observation is that while I am using this configuration below,
> so the UDL device should be assigned to :0.2 (and active!), it is really
> assigned to :0[.0] as an inactive output. See that there's no "*" indicator
> set for any of the supported modes on DVI-I-1-1.
>
> How can I set up 3 different Screens correctly for 3 separate fullscreen
> applications?
>
> I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
> patch from Dave Airlie that at least wakes up the UDL device and makes
> it visible without extra magic with providers/sinks.
Hi,
for your specific use case, auto-bind is exactly what you do not want.
So drop the patch or (since the patch is in upstream master already)
use the option it adds to stop auto-binding.
Thanks,
pq
* Re: UDL device cannot get its own screen
2019-10-23 7:42 ` Pekka Paalanen
@ 2019-10-23 12:12 ` Böszörményi Zoltán
2019-10-23 12:42 ` Pekka Paalanen
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-10-23 12:12 UTC (permalink / raw)
To: Pekka Paalanen; +Cc: xorg, Mailing list - DRI developers
2019. 10. 23. 9:42 keltezéssel, Pekka Paalanen írta:
> On Tue, 22 Oct 2019 17:50:21 +0200
> Böszörményi Zoltán <zboszor@pr.hu> wrote:
>
>> Hi,
>>
>> I have the below configuration for an Intel-based POS system that,
>> while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
>> has only two usable: DP1 for the built-in touchscreen and VGA1 for
>> the external VGA connector.
>>
>> I wanted to use a USB DisplayLink device as the 3rd output, with each
>> of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
>
> ...
>
>> The third observation is that while I am using this configuration below,
>> so the UDL device should be assigned to :0.2 (and active!), it is really
>> assigned to :0[.0] as an inactive output. See that there's no "*" indicator
>> set for any of the supported modes on DVI-I-1-1.
>>
>> How can I set up 3 different Screens correctly for 3 separate fullscreen
>> applications?
>>
>> I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
>> patch from Dave Airlie that at least wakes up the UDL device and makes
>> it visible without extra magic with providers/sinks.
>
> Hi,
>
> for your specific use case, auto-bind is exactly what you do not want.
> So drop the patch or (since the patch is in upstream master already)
> use the option it adds to stop auto-binding.
With Option "AutoBindGPU" "false" in effect (equivalent to backing the
patch out), the UDL device does not get assigned to ANY of the screens.
I want it to have its own :0.2, but that doesn't happen.
>
>
> Thanks,
> pq
>
* Re: UDL device cannot get its own screen
2019-10-23 12:12 ` Böszörményi Zoltán
@ 2019-10-23 12:42 ` Pekka Paalanen
0 siblings, 0 replies; 14+ messages in thread
From: Pekka Paalanen @ 2019-10-23 12:42 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Mailing list - DRI developers
On Wed, 23 Oct 2019 14:12:03 +0200
Böszörményi Zoltán <zboszor@pr.hu> wrote:
> 2019. 10. 23. 9:42 keltezéssel, Pekka Paalanen írta:
> > On Tue, 22 Oct 2019 17:50:21 +0200
> > Böszörményi Zoltán <zboszor@pr.hu> wrote:
> >
> >> Hi,
> >>
> >> I have the below configuration for an Intel-based POS system that,
> >> while it advertises 3 outputs (DP1, VGA1 and HDMI1 with xf86-video-intel),
> >> has only two usable: DP1 for the built-in touchscreen and VGA1 for
> >> the external VGA connector.
> >>
> >> I wanted to use a USB DisplayLink device as the 3rd output, with each
> >> of the three outputs having its own Screen number, i.e. :0.0, :0.1 and :0.2.
> >
> > ...
> >
> >> The third observation is that while I am using this configuration below,
> >> so the UDL device should be assigned to :0.2 (and active!), it is really
> >> assigned to :0[.0] as an inactive output. See that there's no "*" indicator
> >> set for any of the supported modes on DVI-I-1-1.
> >>
> >> How can I set up 3 different Screens correctly for 3 separate fullscreen
> >> applications?
> >>
> >> I am using Xorg 1.20.4 patched with the "autobind GPUs to the screen"
> >> patch from Dave Airlie that at least wakes up the UDL device and makes
> >> it visible without extra magic with providers/sinks.
> >
> > Hi,
> >
> > for your specific use case, auto-bind is exactly what you do not want.
> > So drop the patch or (since the patch is in upstream master already)
> > use the option it adds to stop auto-binding.
>
> With Option "AutoBindGPU" "false" in effect (equivalent to backing the
> patch out), the UDL device does not get assigned to ANY of the screens.
>
> I want it to have its own :0.2, but that doesn't happen.
Yes, that's another problem. Autobind=false is a step in the right
direction, but apparently not sufficient.
Thanks,
pq
* Re: UDL device cannot get its own screen
2019-10-23 6:41 ` Böszörményi Zoltán
@ 2019-10-23 13:32 ` Ilia Mirkin
2019-11-05 14:22 ` Böszörményi Zoltán
0 siblings, 1 reply; 14+ messages in thread
From: Ilia Mirkin @ 2019-10-23 13:32 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Mailing list - DRI developers
On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>
> 2019. 10. 22. 22:57 keltezéssel, Ilia Mirkin írta:
> > On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
> >> Section "Device"
> >> Identifier "UDL"
> >> Driver "modesetting"
> >> Option "kmsdev" "/dev/dri/card0"
> >> Screen 2
> >> Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
> >
> > I think you have an extra -1 in here (and the monitor name doesn't
> > exist as per above). And I think the "Screen" index is wrong -- it's
> > not what one tends to think it is, as I recall. I think you can just
> > drop these lines though.
>
> Without "Screen N" lines, all the outputs are assigned to :0
> so the screen layout setup in the ServerLayout section is not
> applied properly.
>
As I remember it, the Screen here is for ZaphodHeads-type
configurations, and it indicates which head of the underlying device
you're supposed to use. My suggestion was to remove it only here, not
everywhere.
-ilia
* Re: UDL device cannot get its own screen
2019-10-23 13:32 ` Ilia Mirkin
@ 2019-11-05 14:22 ` Böszörményi Zoltán
2019-11-12 14:23 ` Böszörményi Zoltán
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-11-05 14:22 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Mailing list - DRI developers
Hi,
2019. 10. 23. 15:32 keltezéssel, Ilia Mirkin írta:
> On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>
>> 2019. 10. 22. 22:57 keltezéssel, Ilia Mirkin írta:
>>> On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>>> Section "Device"
>>>> Identifier "UDL"
>>>> Driver "modesetting"
>>>> Option "kmsdev" "/dev/dri/card0"
>>>> Screen 2
>>>> Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
>>>
>>> I think you have an extra -1 in here (and the monitor name doesn't
>>> exist as per above). And I think the "Screen" index is wrong -- it's
>>> not what one tends to think it is, as I recall. I think you can just
>>> drop these lines though.
>>
>> Without "Screen N" lines, all the outputs are assigned to :0
>> so the screen layout setup in the ServerLayout section is not
>> applied properly.
>>
>
> As I remember it, the Screen here is for ZaphodHeads-type
> configurations, and it indicates which head of the underlying device
> you're supposed to use. My suggestion was to remove it only here, not
> everywhere.
Okay, but it still doesn't create a working setup.
In the meantime I switched to the GIT version of Xorg, but
it didn't make a difference (for now).
I decided to start from a mostly clean configuration, whatever default
settings or drivers are used. It's modesetting across the board.
The configuration file has just this:
=====================================
Section "ServerFlags"
Option "AutoBindGPU" "true/false"
EndSection
=====================================
Xorg.0.log has these lines (same as 1.20.4), regardless of the AutoBindGPU setting:
... all 3 monitors' EDID data is read and okay ...
[ 879.136] (II) modeset(G0): Damage tracking initialized
[ 879.140] (II) modeset(0): Damage tracking initialized
[ 879.140] (II) modeset(0): Setting screen physical size to 609 x 270
modeset(G0) is UDL and there is no "screen physical size" set for it.
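For what it's worth, the "screen physical size" values in the log are consistent with the server's usual 96 DPI fallback applied to the combined framebuffer, rather than with any EDID-reported size. A sketch of that computation (the helper name is our own):

```python
def xorg_physical_size_mm(width_px, height_px, dpi=96):
    # mm = pixels * 25.4 mm-per-inch / dots-per-inch,
    # truncated the way the log line reports it
    return int(width_px * 25.4 / dpi), int(height_px * 25.4 / dpi)

# 2304x1024 is the combined VGA-1 + DP-1 framebuffer from the xrandr header
print(xorg_physical_size_mm(2304, 1024))  # (609, 270) -- matches the log
```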
xrandr shows 3 outputs for Intel and 1 for UDL. UDL doesn't have an active mode set:
# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 2304 x 1024, maximum 8192 x 8192
VGA-1 connected primary 1280x1024+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
1280x1024 60.02*+
1152x864 75.00
1024x768 75.03 60.00
832x624 74.55
800x600 75.00 60.32
640x480 75.00 59.94
720x400 70.08
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-1 connected 1024x768+1280+0 (normal left inverted right x axis y axis) 304mm x 228mm
1024x768 60.00*+
DVI-I-1-1 connected (normal left inverted right x axis y axis)
1920x1080 60.00 +
1680x1050 59.88
1280x1024 75.02 60.02
1440x900 74.98 59.90
1280x720 60.00
1024x768 75.03 60.00
800x600 75.00 60.32
640x480 75.00 72.81 66.67 59.94
720x400 70.08
1280x1024 (0x44) 108.000MHz +HSync +VSync
h: width 1280 start 1328 end 1440 total 1688 skew 0 clock 63.98KHz
v: height 1024 start 1025 end 1028 total 1066 clock 60.02Hz
1024x768 (0x48) 78.750MHz +HSync +VSync
h: width 1024 start 1040 end 1136 total 1312 skew 0 clock 60.02KHz
v: height 768 start 769 end 772 total 800 clock 75.03Hz
1024x768 (0x49) 65.000MHz -HSync -VSync
h: width 1024 start 1048 end 1184 total 1344 skew 0 clock 48.36KHz
v: height 768 start 771 end 777 total 806 clock 60.00Hz
800x600 (0x4a) 49.500MHz +HSync +VSync
h: width 800 start 816 end 896 total 1056 skew 0 clock 46.88KHz
v: height 600 start 601 end 604 total 625 clock 75.00Hz
800x600 (0x4b) 40.000MHz +HSync +VSync
h: width 800 start 840 end 968 total 1056 skew 0 clock 37.88KHz
v: height 600 start 601 end 605 total 628 clock 60.32Hz
640x480 (0x4c) 31.500MHz -HSync -VSync
h: width 640 start 656 end 720 total 840 skew 0 clock 37.50KHz
v: height 480 start 481 end 484 total 500 clock 75.00Hz
640x480 (0x4f) 25.175MHz -HSync -VSync
h: width 640 start 656 end 752 total 800 skew 0 clock 31.47KHz
v: height 480 start 490 end 492 total 525 clock 59.94Hz
720x400 (0x50) 28.320MHz -HSync +VSync
h: width 720 start 738 end 846 total 900 skew 0 clock 31.47KHz
v: height 400 start 412 end 414 total 449 clock 70.08Hz
#
I can't actually set a mode for it manually:
# DISPLAY=:0 xrandr --output DVI-I-1-1 --mode 1280x1024
xrandr: Configure crtc 2 failed
So, for some reason, while the output is there and the monitor is
detected via EDID, no CRTC is assigned to it.
With AutoBindGPU=false, the UDL device is not actually activated,
despite the log lines about modeset(G0) with EDID detected and so on.
# DISPLAY=:0 xrandr
Screen 0: minimum 320 x 200, current 2304 x 1024, maximum 8192 x 8192
VGA-1 connected primary 1280x1024+0+0 (normal left inverted right x axis y axis) 376mm x 301mm
1280x1024 60.02*+
1152x864 75.00
1024x768 75.03 60.00
832x624 74.55
800x600 75.00 60.32
640x480 75.00 59.94
720x400 70.08
HDMI-1 disconnected (normal left inverted right x axis y axis)
DP-1 connected 1024x768+1280+0 (normal left inverted right x axis y axis) 304mm x 228mm
1024x768 60.00*+
#
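Spotting outputs that xrandr lists as connected but without an active mode (no "*" in the mode list) can be automated. A rough parser sketch, assuming xrandr's usual output layout (the helper is our own, not an xrandr feature):

```python
import re

def outputs_without_active_mode(xrandr_text):
    """Connected outputs whose mode list shows no '*' (no active mode set)."""
    result, current, has_star = [], None, False
    for line in xrandr_text.splitlines():
        m = re.match(r"(\S+) connected", line)
        # a new output header or the Screen line ends the previous mode list
        if m or re.match(r"\S+ disconnected", line) or line.startswith("Screen"):
            if current is not None and not has_star:
                result.append(current)
            current = m.group(1) if m else None
            has_star = False
        elif current is not None and "*" in line:
            has_star = True
    if current is not None and not has_star:
        result.append(current)
    return result

sample = """Screen 0: minimum 320 x 200, current 2304 x 1024, maximum 8192 x 8192
VGA-1 connected primary 1280x1024+0+0 (normal) 376mm x 301mm
   1280x1024     60.02*+
DP-1 connected 1024x768+1280+0 (normal) 304mm x 228mm
   1024x768      60.00*+
DVI-I-1-1 connected (normal)
   1920x1080     60.00 +
"""
print(outputs_without_active_mode(sample))  # ['DVI-I-1-1']
```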
The explicit mode setting fails with a different (but expected) error:
# DISPLAY=:0 xrandr --output DVI-I-1-1 --mode 1280x1024
warning: output DVI-I-1-1 not found; ignoring
Something is wrong somewhere in the server code.
Best regards,
Zoltán Böszörményi
* Re: UDL device cannot get its own screen
2019-11-05 14:22 ` Böszörményi Zoltán
@ 2019-11-12 14:23 ` Böszörményi Zoltán
2019-11-12 16:41 ` Ilia Mirkin
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-11-12 14:23 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Mailing list - DRI developers
2019. 11. 05. 15:22 keltezéssel, Böszörményi Zoltán írta:
> Hi,
>
> 2019. 10. 23. 15:32 keltezéssel, Ilia Mirkin írta:
>> On Wed, Oct 23, 2019 at 2:41 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>>
>>> 2019. 10. 22. 22:57 keltezéssel, Ilia Mirkin írta:
>>>> On Tue, Oct 22, 2019 at 11:50 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>>>> Section "Device"
>>>>> Identifier "UDL"
>>>>> Driver "modesetting"
>>>>> Option "kmsdev" "/dev/dri/card0"
>>>>> Screen 2
>>>>> Option "Monitor-DVI-I-1-1" "DVI-I-1-1"
>>>>
>>>> I think you have an extra -1 in here (and the monitor name doesn't
>>>> exist as per above). And I think the "Screen" index is wrong -- it's
>>>> not what one tends to think it is, as I recall. I think you can just
>>>> drop these lines though.
>>>
>>> Without "Screen N" lines, all the outputs are assigned to :0
>>> so the screen layout setup in the ServerLayout section is not
>>> applied properly.
>>>
>>
>> As I remember it, the Screen here is for ZaphodHeads-type
>> configurations, and it indicates which head of the underlying device
>> you're supposed to use. My suggestion was to remove it only here, not
>> everywhere.
>
> Okay, but it still doesn't create a working setup.
So, finally I got back to experimenting with this.
I have read "man 5 xorg.conf" more closely and found the option
GPUDevice in Section "Screen". Here's the configuration I came up
with, but it still doesn't work:
==============================================
Section "ServerFlags"
Option "AutoBindGPU" "false"
EndSection
Section "Monitor"
Identifier "Monitor-DP-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "Monitor-VGA-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "Monitor-HDMI-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Monitor"
Identifier "Monitor-DVI-I-1"
Option "AutoServerLayout" "on"
Option "Rotate" "normal"
EndSection
Section "Device"
Identifier "Intel0"
Driver "modesetting"
BusID "PCI:0:2:0"
Screen 0
Option "Monitor-DP-1" "DP-1"
Option "ZaphodHeads" "DP-1"
EndSection
Section "Device"
Identifier "Intel1"
Driver "modesetting"
BusID "PCI:0:2:0"
Screen 1
Option "Monitor-VGA-1" "VGA-1"
Option "ZaphodHeads" "VGA-1"
EndSection
Section "Device"
Identifier "Intel2"
Driver "modesetting"
BusID "PCI:0:2:0"
Screen 2
Option "Monitor-HDMI-1" "HDMI-1"
Option "ZaphodHeads" "HDMI-1"
EndSection
Section "Device"
Identifier "UDL"
Driver "modesetting"
Option "kmsdev" "/dev/dri/card0"
# Suggestion of Ilia Mirkin: Don't set Screen here
#Screen 2
Option "Monitor-DVI-I-1" "DVI-I-1"
Option "ZaphodHeads" "DVI-I-1"
EndSection
Section "Screen"
Identifier "SCREEN"
Option "AutoServerLayout" "on"
Device "Intel0"
Monitor "Monitor-DP1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "Screen"
Identifier "SCREEN1"
Option "AutoServerLayout" "on"
Device "Intel1"
Monitor "Monitor-VGA1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "Screen"
Identifier "SCREEN2"
Option "AutoServerLayout" "on"
Device "UDL"
GPUDevice "Intel2"
Monitor "Monitor-DVI-I-1"
SubSection "Display"
Modes "1024x768"
Depth 24
EndSubSection
EndSection
Section "ServerLayout"
Identifier "LAYOUT"
Option "AutoServerLayout" "on"
Screen 0 "SCREEN"
Screen 1 "SCREEN1" RightOf "SCREEN"
Screen 2 "SCREEN2" RightOf "SCREEN1"
EndSection
==============================================
Obviously, I want *some* GPU acceleration that does its work
over the UDL framebuffer.
With the above setup, I get these:
# DISPLAY=:0 xrandr --listproviders
Providers: number : 2
Provider 0: id: 0x40 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload
crtcs: 1 outputs: 1 associated providers: 0 name:modesetting
Provider 1: id: 0xac cap: 0x2, Sink Output crtcs: 1 outputs: 1 associated providers: 0
name:modesetting
# DISPLAY=:0.1 xrandr --listproviders
Providers: number : 1
Provider 0: id: 0x72 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload
crtcs: 1 outputs: 1 associated providers: 0 name:modesetting
# DISPLAY=:0.2 xrandr --listproviders
Can't open display :0.2
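The "cap:" bitmask in the provider listing decodes with the RandR 1.4 provider capability bits, which is what the names after it spell out. A small sketch (helper names are our own):

```python
# RandR 1.4 provider capability bits, in the order xrandr prints them
CAPS = [(0x1, "Source Output"), (0x2, "Sink Output"),
        (0x4, "Source Offload"), (0x8, "Sink Offload")]

def decode_caps(mask):
    return [name for bit, name in CAPS if mask & bit]

print(decode_caps(0xF))  # all four, as shown for provider 0x40 on :0
print(decode_caps(0x2))  # ['Sink Output'], as shown for the UDL provider 0xac
```

A sink-only provider (cap: 0x2) can only have output sourced to it, which is why UDL depends on being slaved to a source provider at all.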
According to /var/log/Xorg.0.log, I have:
[ 1917.884] (II) modeset(0): using drv /dev/dri/card1
[ 1917.884] (II) modeset(1): using drv /dev/dri/card1
[ 1917.884] (II) modeset(G0): using drv /dev/dri/card1
[ 1917.884] (II) modeset(G1): using drv /dev/dri/card0
modeset(0) is the Intel DP-1 output, monitor attached, EDID detected
modeset(1) is the Intel VGA-1 output, monitor attached, EDID detected
modeset(G0) is the Intel HDMI-1 output, no monitor, no EDID
modeset(G1) is the UDL device, monitor attached, EDID detected
However:
[ 1918.521] (II) modeset(G0): Damage tracking initialized
[ 1918.525] (II) modeset(0): Damage tracking initialized
[ 1918.525] (II) modeset(0): Setting screen physical size to 270 x 203
[ 1918.528] (II) modeset(1): Damage tracking initialized
[ 1918.528] (II) modeset(1): Setting screen physical size to 270 x 203
Most notably, there is no "modeset(G1): Setting screen physical size" message,
even though EDID was detected, and presumably correctly, since the monitor name
is right.
FYI, inverting the roles of UDL and Intel2 does not work either, i.e. using:
Device "Intel2"
GPUDevice "UDL"
This way, at least there's a DISPLAY=:0.2 screen (Intel2) but still
no working UDL. Also, xrandr --listproviders insists on showing the UDL
provider line for DISPLAY=:0 instead of DISPLAY=:0.2.
> In the meantime I switched to the GIT version of Xorg, but
> it didn't make a difference (for now).
Current GIT commit 562c7888be538c4d043ec1f374a9d9afa0b305a4, plus applied MRs:
* 155 (USB device prefix handling),
* 325 (reorder ScreenInit) and
* 326 (more robust hotplug GPU handling).
I also have this patch to see whether the auto-bound GPUs actually have
their screen numbers set correctly according to the ServerLayout section:
diff --git a/hw/xfree86/common/xf86Init.c b/hw/xfree86/common/xf86Init.c
index 6cc2f0b01..3e21644fe 100644
--- a/hw/xfree86/common/xf86Init.c
+++ b/hw/xfree86/common/xf86Init.c
@@ -210,9 +210,29 @@ xf86AutoConfigOutputDevices(void)
     if (!xf86Info.autoBindGPU)
         return;
 
-    for (i = 0; i < xf86NumGPUScreens; i++)
+    xf86ErrorFVerb(0, "xf86AutoConfigOutputDevices: xf86NumScreens %d xf86NumGPUScreens %d\n", xf86NumScreens, xf86NumGPUScreens);
+    for (i = 0; i < xf86NumGPUScreens; i++) {
+        xf86ErrorFVerb(0,
+                       "xf86AutoConfigOutputDevices: GPU #%d driver '%s' '%s' "
+                       "scrnIndex %d origIndex %d pScreen->myNum %d confScreen->screennum %d "
+                       "confScreen->device->identifier '%s' "
+                       "confScreen->device->screen %d confScreen->device->myScreenSection->screennum %d "
+                       "confScreen->device->myScreenSection->device->screen %d\n",
+                       i,
+                       xf86GPUScreens[i]->driverName,
+                       xf86GPUScreens[i]->name,
+                       xf86GPUScreens[i]->scrnIndex,
+                       xf86GPUScreens[i]->origIndex,
+                       xf86GPUScreens[i]->pScreen->myNum,
+                       xf86GPUScreens[i]->confScreen->screennum,
+                       xf86GPUScreens[i]->confScreen->device->identifier,
+                       xf86GPUScreens[i]->confScreen->device->screen,
+                       xf86GPUScreens[i]->confScreen->device->myScreenSection->screennum,
+                       xf86GPUScreens[i]->confScreen->device->myScreenSection->device->screen
+                       );
         RRProviderAutoConfigGpuScreen(xf86ScrnToScreen(xf86GPUScreens[i]),
                                       xf86ScrnToScreen(xf86Screens[0]));
+    }
 }
 
 static void
But no, all GPU devices (now only one, the UDL device) have screen 0
(a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:
[ 2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
[ 2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex
256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0
confScreen->device->identifier 'Intel0'
confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0
confScreen->device->myScreenSection->device->screen 0
Somehow, Option "Device" should ensure that the UDL device is actually
treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
instead of modeset(Gn)) and it should be woken up automatically.
This is what AutoBindGPU is supposed to do, isn't it?
But instead of assigning to screen 0, it should be assigned to whatever
screen number it is configured as.
I know it's not a common use case nowadays, but I really want separate
fullscreen apps on their independent screens, including a standalone UDL
device, instead of having the latter as a Xinerama extension to some
other device.
Best regards,
Zoltán Böszörményi
_______________________________________________
dri-devel mailing list
dri-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/dri-devel
* Re: UDL device cannot get its own screen
2019-11-12 14:23 ` Böszörményi Zoltán
@ 2019-11-12 16:41 ` Ilia Mirkin
2019-11-13 16:58 ` Böszörményi Zoltán
0 siblings, 1 reply; 14+ messages in thread
From: Ilia Mirkin @ 2019-11-12 16:41 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Maling list - DRI developers
On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
> But no, all GPU devices (now only one, the UDL device) have screen 0
> (a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:
>
> [ 2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
> [ 2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex
> 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0
> confScreen->device->identifier 'Intel0'
> confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0
> confScreen->device->myScreenSection->device->screen 0
>
> Somehow, Option "Device" should ensure that the UDL device is actually
> treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
> instead of modeset(Gn)) and it should be woken up automatically.
>
> This is what AutoBindGPU is supposed to do, isn't it?
>
> But instead of assigning to screen 0, it should be assigned to whatever
> screen number it is configured as.
>
> I know it's not a common use case nowadays, but I really want separate
> fullscreen apps on their independent screens, including a standalone UDL
> device, instead of having the latter as a Xinerama extension to some
> other device.
If you see a "G", that means it's being treated as a GPU device, which
is *not* what you want if you want separate screens. You need to try
to convince things to *not* set the devices up as GPU devices, but
instead put each device (and each one of its heads, via ZaphodHeads)
on a separate device, which in turn will have a separate screen.
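[Editorial note: this suggestion — one Device section per head via ZaphodHeads — could be sketched as below; the identifiers and head names are illustrative assumptions, not a tested config.]

```
Section "Device"
	Identifier "Intel0"
	Driver     "modesetting"
	Option     "ZaphodHeads" "DP-1"
EndSection

Section "Device"
	Identifier "Intel1"
	Driver     "modesetting"
	Option     "ZaphodHeads" "VGA-1"
EndSection
```

Each Device is then referenced by its own Screen section, so each head gets its own X screen rather than being attached as a GPU device.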
-ilia
* Re: UDL device cannot get its own screen
2019-11-12 16:41 ` Ilia Mirkin
@ 2019-11-13 16:58 ` Böszörményi Zoltán
2019-11-13 17:25 ` Ilia Mirkin
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-11-13 16:58 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Maling list - DRI developers
2019. 11. 12. 17:41 keltezéssel, Ilia Mirkin írta:
> On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>> But no, all GPU devices (now only one, the UDL device) have screen 0
>> (a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:
>>
>> [ 2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
>> [ 2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex
>> 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0
>> confScreen->device->identifier 'Intel0'
>> confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0
>> confScreen->device->myScreenSection->device->screen 0
>>
>> Somehow, Option "Device" should ensure that the UDL device is actually
>> treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
>> instead of modeset(Gn)) and it should be woken up automatically.
>>
>> This is what AutoBindGPU is supposed to do, isn't it?
>>
>> But instead of assigning to screen 0, it should be assigned to whatever
>> screen number it is configured as.
>>
>> I know it's not a common use case nowadays, but I really want separate
>> fullscreen apps on their independent screens, including a standalone UDL
>> device, instead of having the latter as a Xinerama extension to some
>> other device.
>
> If you see a "G", that means it's being treated as a GPU device, which
> is *not* what you want if you want separate screens. You need to try
> to convince things to *not* set the devices up as GPU devices, but
> instead put each device (and each one of its heads, via ZaphodHeads)
> on a separate device, which in turn will have a separate screen.
I created a merge request that finally made what I wanted possible.
https://gitlab.freedesktop.org/xorg/xserver/merge_requests/334
Now, no matter whether I use the intel or modesetting driver for the
Device sections driving the Intel heads, or whether AutoBindGPU is set
to true or false, the UDL device is correctly matched via its Option
"kmsdev" setting to the platform device's device path.
This patch seems to be a slight layering violation, but since the
modesetting driver is built into the Xorg server sources, the patch
may get away with it.
Best regards,
Zoltán Böszörményi
* Re: UDL device cannot get its own screen
2019-11-13 16:58 ` Böszörményi Zoltán
@ 2019-11-13 17:25 ` Ilia Mirkin
2019-11-13 18:08 ` Böszörményi Zoltán
0 siblings, 1 reply; 14+ messages in thread
From: Ilia Mirkin @ 2019-11-13 17:25 UTC (permalink / raw)
To: Böszörményi Zoltán; +Cc: xorg, Maling list - DRI developers
On Wed, Nov 13, 2019 at 11:59 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>
> 2019. 11. 12. 17:41 keltezéssel, Ilia Mirkin írta:
> > On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
> >> But no, all GPU devices (now only one, the UDL device) have screen 0
> >> (a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:
> >>
> >> [ 2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
> >> [ 2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex
> >> 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0
> >> confScreen->device->identifier 'Intel0'
> >> confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0
> >> confScreen->device->myScreenSection->device->screen 0
> >>
> >> Somehow, Option "Device" should ensure that the UDL device is actually
> >> treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
> >> instead of modeset(Gn)) and it should be woken up automatically.
> >>
> >> This is what AutoBindGPU is supposed to do, isn't it?
> >>
> >> But instead of assigning to screen 0, it should be assigned to whatever
> >> screen number it is configured as.
> >>
> >> I know it's not a common use case nowadays, but I really want separate
> >> fullscreen apps on their independent screens, including a standalone UDL
> >> device, instead of having the latter as a Xinerama extension to some
> >> other device.
> >
> > If you see a "G", that means it's being treated as a GPU device, which
> > is *not* what you want if you want separate screens. You need to try
> > to convince things to *not* set the devices up as GPU devices, but
> > instead put each device (and each one of its heads, via ZaphodHeads)
> > on a separate device, which in turn will have a separate screen.
>
> I created a merge request that finally made it possible what I wanted.
>
> https://gitlab.freedesktop.org/xorg/xserver/merge_requests/334
>
> Now, no matter if I use the intel or modesetting drivers for the
> Device sections using the Intel heads, or AutoBindGPU set to true or
> false, the UDL device is correctly matched with its Option "kmsdev"
> setting to the platform device's device path.
>
> This patch seems to be a slight layering violation, but since the
> modesetting driver is built into the Xorg server sources, the patch
> may get away with it.
Have you looked at setting AutoAddGPU to false? AutoBindGPU is too
late -- by then you already have a GPU, and it only decides whether to
bind it to the primary device (/screen/whatever). You need to not have
a GPU in the first place.
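[Editorial note: for reference, the flag mentioned here lives in the ServerFlags section of xorg.conf; a minimal sketch:]

```
Section "ServerFlags"
	Option "AutoAddGPU"  "false"
	Option "AutoBindGPU" "false"
EndSection
```

With AutoAddGPU off, hotplugged secondary devices are not attached as GPU screens at all, which is why they must then be configured explicitly.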
-ilia
* Re: UDL device cannot get its own screen
2019-11-13 17:25 ` Ilia Mirkin
@ 2019-11-13 18:08 ` Böszörményi Zoltán
2019-11-14 8:50 ` Böszörményi Zoltán
0 siblings, 1 reply; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-11-13 18:08 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Maling list - DRI developers
2019. 11. 13. 18:25 keltezéssel, Ilia Mirkin írta:
> On Wed, Nov 13, 2019 at 11:59 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>
>> 2019. 11. 12. 17:41 keltezéssel, Ilia Mirkin írta:
>>> On Tue, Nov 12, 2019 at 9:23 AM Böszörményi Zoltán <zboszor@pr.hu> wrote:
>>>> But no, all GPU devices (now only one, the UDL device) have screen 0
>>>> (a.k.a. DISPLAY=:0.0) set when AutoBindGPU is true:
>>>>
>>>> [ 2444.576] xf86AutoConfigOutputDevices: xf86NumScreens 2 xf86NumGPUScreens 1
>>>> [ 2444.576] xf86AutoConfigOutputDevices: GPU #0 driver 'modesetting' 'modeset' scrnIndex
>>>> 256 origIndex 257 pScreen->myNum 256 confScreen->screennum 0
>>>> confScreen->device->identifier 'Intel0'
>>>> confScreen->device->screen 0 confScreen->device->myScreenSection->screennum 0
>>>> confScreen->device->myScreenSection->device->screen 0
>>>>
>>>> Somehow, Option "Device" should ensure that the UDL device is actually
>>>> treated as a framebuffer that can be rendered into (i.e. to be modeset(2)
>>>> instead of modeset(Gn)) and it should be woken up automatically.
>>>>
>>>> This is what AutoBindGPU is supposed to do, isn't it?
>>>>
>>>> But instead of assigning to screen 0, it should be assigned to whatever
>>>> screen number it is configured as.
>>>>
>>>> I know it's not a common use case nowadays, but I really want separate
>>>> fullscreen apps on their independent screens, including a standalone UDL
>>>> device, instead of having the latter as a Xinerama extension to some
>>>> other device.
>>>
>>> If you see a "G", that means it's being treated as a GPU device, which
>>> is *not* what you want if you want separate screens. You need to try
>>> to convince things to *not* set the devices up as GPU devices, but
>>> instead put each device (and each one of its heads, via ZaphodHeads)
>>> on a separate device, which in turn will have a separate screen.
>>
>> I created a merge request that finally made it possible what I wanted.
>>
>> https://gitlab.freedesktop.org/xorg/xserver/merge_requests/334
>>
>> Now, no matter if I use the intel or modesetting drivers for the
>> Device sections using the Intel heads, or AutoBindGPU set to true or
>> false, the UDL device is correctly matched with its Option "kmsdev"
>> setting to the platform device's device path.
>>
>> This patch seems to be a slight layering violation, but since the
>> modesetting driver is built into the Xorg server sources, the patch
>> may get away with it.
>
> Have you looked at setting AutoAddGPU to false? AutoBindGPU is too
> late -- that's when you already have a GPU, whether to bind it to the
> primary device (/screen/whatever). You need to not have a GPU in the
> first place.
Yes, I tried AutoAddGPU=false. Then the UDL device was not set up at all.
What I noticed in debugging Xorg via GDB is that the UDL device was
matched to the wrong platform device in xf86platformProbeDev.
xf86_platform_devices[0] == Intel, /dev/dri/card1, primary platform device
xf86_platform_devices[1] == UDL, /dev/dri/card0
devList[0] == "Intel0"
devList[1] == "Intel1"
devList[2] == "UDL"
devList[3] == "Intel2" (GPU device)
Since the device path was not matched and the PCI ID did not match
(after all, the UDL device is NOT PCI), this code was executed:
    else {
        /* for non-seat0 servers assume first device is the master */
        if (ServerIsNotSeat0())
            break;
        if (xf86IsPrimaryPlatform(&xf86_platform_devices[j]))
            break;
    }
So, probeSingleDevice() was called with xf86_platform_devices[0] and
devList[2], resulting in the UDL device being set up as a GPU device and
not a framebuffer in its own right.
My MR modifies this so that if there is an explicit Option "kmsdev"
setting, it is matched first. The final else branch is only executed in
the default case with no explicit configuration.
With this MR, the explicit configuration for UDL works regardless of the
AutoBindGPU value.
Best regards,
Zoltán Böszörményi
* Re: UDL device cannot get its own screen
2019-11-13 18:08 ` Böszörményi Zoltán
@ 2019-11-14 8:50 ` Böszörményi Zoltán
0 siblings, 0 replies; 14+ messages in thread
From: Böszörményi Zoltán @ 2019-11-14 8:50 UTC (permalink / raw)
To: Ilia Mirkin; +Cc: xorg, Maling list - DRI developers
2019. 11. 13. 19:08 keltezéssel, Böszörményi Zoltán írta:
> 2019. 11. 13. 18:25 keltezéssel, Ilia Mirkin írta:
>>
>> Have you looked at setting AutoAddGPU to false? AutoBindGPU is too
>> late -- that's when you already have a GPU, whether to bind it to the
>> primary device (/screen/whatever). You need to not have a GPU in the
>> first place.
>
> Yes, I tried AutoAddGPU=false. Then the UDL device was not set up at all.
>
> What I noticed in debugging Xorg via GDB is that the UDL device was
> matched to the wrong platform device in xf86platformProbeDev.
> [long details deleted]
Now the xserver MR is at https://gitlab.freedesktop.org/xorg/xserver/merge_requests/335
with a commit message explaining the same details as in my previous mail.
I have also created https://gitlab.freedesktop.org/xorg/xserver/merge_requests/336
to fix the same issue when using BusID for the UDL device.
Best regards,
Zoltán Böszörményi
Thread overview: 14+ messages
2019-10-22 15:50 UDL device cannot get its own screen Böszörményi Zoltán
2019-10-22 20:57 ` Ilia Mirkin
2019-10-23 6:41 ` Böszörményi Zoltán
2019-10-23 13:32 ` Ilia Mirkin
2019-11-05 14:22 ` Böszörményi Zoltán
2019-11-12 14:23 ` Böszörményi Zoltán
2019-11-12 16:41 ` Ilia Mirkin
2019-11-13 16:58 ` Böszörményi Zoltán
2019-11-13 17:25 ` Ilia Mirkin
2019-11-13 18:08 ` Böszörményi Zoltán
2019-11-14 8:50 ` Böszörményi Zoltán
2019-10-23 7:42 ` Pekka Paalanen
2019-10-23 12:12 ` Böszörményi Zoltán
2019-10-23 12:42 ` Pekka Paalanen