Bug 201795 - [Regression] Wrong 4k resolution detected with DisplayPort to HDMI adapter on amdgpu
Summary: [Regression] Wrong 4k resolution detected with DisplayPort to HDMI adapter on...
Status: NEW
Alias: None
Product: Drivers
Classification: Unclassified
Component: Video (DRI - non Intel)
Hardware: All Linux
Importance: P1 normal
Assignee: drivers_video-dri
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2018-11-27 19:30 UTC by thomas.lassdiesonnerein
Modified: 2019-03-18 13:45 UTC
CC: 3 users

See Also:
Kernel Version: 4.19.8
Tree: Mainline
Regression: Yes


Attachments
xrandr output on kernel 4.14 (1.53 KB, text/plain)
2018-11-27 19:30 UTC, thomas.lassdiesonnerein
xrandr output on kernel 4.19 (1.40 KB, text/plain)
2018-11-27 19:35 UTC, thomas.lassdiesonnerein
dmesg of kernel 4.14 (73.41 KB, text/plain)
2018-11-28 19:23 UTC, thomas.lassdiesonnerein
dmesg of kernel 4.19 (78.57 KB, text/plain)
2018-11-28 19:23 UTC, thomas.lassdiesonnerein
xorg log of kernel 4.14 (66.22 KB, text/plain)
2018-11-28 19:24 UTC, thomas.lassdiesonnerein
xorg log of kernel 4.19 (60.21 KB, text/plain)
2018-11-28 19:24 UTC, thomas.lassdiesonnerein

Description thomas.lassdiesonnerein 2018-11-27 19:30:46 UTC
Created attachment 279677 [details]
xrandr output on kernel 4.14

I have an AMD Fury X GPU connected to two 4k monitors:
- one 4k iiyama monitor via DP 1.4
- one 4k LG OLED TV via a DP 1.4 to HDMI 2.0 adapter

Both have a physical resolution of 3840x2160 pixels.

On kernel 4.14 (Manjaro) I can mirror the screens fine.

Newer kernels I tested (4.17 to 4.19) mirror too, but the LG OLED TV runs at 4096x2160 instead of 3840x2160. That resolution is not stretched correctly: I get small black bars left and right, as if the driver thought the panel were physically 4096x2160. If I set 4096x2160 in Plasma 5 it stretches fine, but then I have different resolutions on both screens, which is problematic while mirroring.
Comment 1 thomas.lassdiesonnerein 2018-11-27 19:35:44 UTC
Created attachment 279679 [details]
xrandr output on kernel 4.19
Comment 2 thomas.lassdiesonnerein 2018-11-27 19:38:14 UTC
When I compare the two xrandr outputs, the only difference is the missing interlaced resolutions on kernel 4.19. Maybe that is a hint to where the problem lies?
Comment 3 Michel Dänzer 2018-11-28 09:24:04 UTC
Please also attach the output of dmesg and the Xorg log file in each case.
Comment 4 thomas.lassdiesonnerein 2018-11-28 19:23:01 UTC
Created attachment 279717 [details]
dmesg of kernel 4.14
Comment 5 thomas.lassdiesonnerein 2018-11-28 19:23:46 UTC
Created attachment 279719 [details]
dmesg of kernel 4.19
Comment 6 thomas.lassdiesonnerein 2018-11-28 19:24:17 UTC
Created attachment 279721 [details]
xorg log of kernel 4.14
Comment 7 thomas.lassdiesonnerein 2018-11-28 19:24:55 UTC
Created attachment 279723 [details]
xorg log of kernel 4.19
Comment 8 Michel Dänzer 2018-11-29 08:39:05 UTC
FWIW, amdgpu.dc=0 on the kernel command line might serve as a workaround.
Comment 9 thomas.lassdiesonnerein 2018-11-29 10:04:37 UTC
Thanks, but I'll stay on 4.14 until there is a proper fix :)
If I can help further, I'll happily do so, with the exception of writing code...
Comment 10 thomas.lassdiesonnerein 2018-12-15 07:12:38 UTC
Still there with 4.19.8
I know it is probably low priority. Just keeping this report up to date.
Comment 11 fin4478 2018-12-15 16:53:49 UTC
There is very little happening for the amdgpu and radeon drivers in mainline kernels. See the diff column at kernel.org and compare it with this:
https://cgit.freedesktop.org/~agd5f/linux/log/?h=drm-next-4.21-wip

So use a rolling-release OS, the AMD wip kernel, and Mesa git, e.g. from the Oibaf PPA.
Comment 12 thomas.lassdiesonnerein 2019-01-18 19:00:19 UTC
@fin4478
Read. I use rolling. I won't use wip stuff. 

@Michel Dänzer
The bug is still there with Kernel 4.20.1 and 4.19.14.
With 4.14.92 everything is fine.
Comment 13 tempel.julian 2019-01-23 11:19:49 UTC
As a workaround, I'd try forcing the correct EDID for each display:
https://wiki.archlinux.org/index.php/kernel_mode_setting#Forcing_modes_and_EDID
And maybe also edit the EDID so it offers only the one correct native mode you want to use.
Editing EDIDs on Linux might be painful, so perhaps do it on Windows.
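(One pitfall when hand-editing an EDID: per the VESA EDID spec, each 128-byte block ends in a checksum byte chosen so the whole block sums to 0 mod 256, and drivers reject blocks that don't. A minimal Python sketch of recomputing that byte — the data below is dummy filler with the real magic header, not an actual display's EDID:)

```python
# Hypothetical helper for hand-edited EDIDs. The 128-byte block size and the
# "block must sum to 0 mod 256" rule come from the VESA EDID spec; everything
# else here is dummy data for illustration.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def fix_checksum(block: bytes) -> bytes:
    """Return the block with its last byte recomputed so the sum is 0 mod 256."""
    if len(block) != 128:
        raise ValueError("EDID base blocks are exactly 128 bytes")
    checksum = (-sum(block[:127])) % 256
    return block[:127] + bytes([checksum])

def is_valid(block: bytes) -> bool:
    return len(block) == 128 and sum(block) % 256 == 0

# A dummy block: real magic header, zeroed body, deliberately wrong checksum.
raw = EDID_HEADER + bytes(119) + bytes([0xAB])
print(is_valid(raw), is_valid(fix_checksum(raw)))  # False True
```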
Comment 14 thomas.lassdiesonnerein 2019-02-21 08:40:53 UTC
Still there with 4.20.10

@tempel.julian@gmail.com
I do not need a workaround, thanks. Everything works fine with the 4.14.101 LTS kernel. I just wanted to report this regression and hope for a fix in another LTS kernel, 4.19 or presumably 5.4.
Comment 15 thomas.lassdiesonnerein 2019-03-13 11:00:31 UTC
Still there with 5.0.1
Comment 16 Nicholas Kazlauskas 2019-03-13 13:07:25 UTC
By default, amdgpu DC doesn't perform any sort of display scaling. It relies on the monitor/TV's scaler itself to do the work, which might be producing the black bars.

When you do set the scaling mode via the connector property "scaling mode", the driver will perform the scaling itself. However, this scaling is done on whatever the preferred mode for the monitor is.

I believe that the preferred mode for your TV is 4096x2160:

"4096x2160" 30 297000 4096 4184 4272 4400 2160 2168 2178 2250 0x40 0x5

This should be the timing used when display scaling is done by the driver itself. You might get mirroring to work if you set the connector scaling mode property, eg:

xrandr --output DisplayPort-0 --set "scaling mode" "Full"

and then try changing the mode on the display or mirroring.

This setting can persist in your xorg.conf as well, I believe it looks something like:

Section "Monitor"
    Identifier "DisplayPort-0"
    Option "scaling mode" "Full"
EndSection
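
(Aside, purely as a cross-check of the 30 Hz figure in the modeline quoted above: in a modeline the pixel clock is given in kHz, and the last value of each timing group — 4400 horizontal, 2250 vertical — is the total raster including blanking, so the refresh rate follows directly:)

```python
# Cross-check of the "4096x2160" modeline quoted above:
# refresh = pixel_clock / (htotal * vtotal).

pixel_clock_hz = 297_000 * 1000   # the modeline gives the clock in kHz
htotal, vtotal = 4400, 2250       # totals including blanking

refresh_hz = pixel_clock_hz / (htotal * vtotal)
print(f"{refresh_hz:.1f} Hz")  # 30.0 Hz, matching the mode's advertised rate
```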
Comment 17 thomas.lassdiesonnerein 2019-03-13 19:13:24 UTC
@Nicholas Kazlauskas
Thx for the possible workaround.
I am however not in search for a workaround. This is a regression (4.14 works) and I hope it will be fixed properly some day.
Comment 18 Nicholas Kazlauskas 2019-03-13 20:02:09 UTC
It's intended driver behavior, not a regression.

The driver performs no display scaling by default since it forces the timing for the preferred mode of the display. This means that refresh rate would effectively be ignored - forcing the user to manually set "scaling mode" to "off" in order to use a refresh rate different from the preferred mode. It's certainly not very intuitive default behavior.

There's also good reason to favor using the preferred mode by default for driver scaling - it's the most likely to not be affected by any sort of display side scaling.

The "scaling mode" option isn't a workaround to a problem, but a feature you can use and configure for when you want a different scaling mode than what the monitor offers.
Comment 19 thomas.lassdiesonnerein 2019-03-15 07:54:54 UTC
Are you telling me that the new code is better for most other users, but at the same time works worse in my case, so you won't fix it? Then I appreciate that you want me to understand this first, and I will try now. Is the following understanding correct?

Kernel <= 4.14: scaling is ON by default. If users want a custom refresh rate, they need to switch the scaling mode OFF via xorg.conf, but then they get black bars.

Kernel > 4.14: if users want full screen instead of black bars, they have to switch the scaling mode ON via xorg.conf.


Assessment:
Both are not intuitive. They rely on a configuration file instead of a GUI option in plain sight. Also, the black bars are more disturbing to the average user than not being able to set a custom refresh rate.

Idea:
Why does the driver not scale ON by default AND set the refresh rate of the preferred mode?


I know I could be completely wrong here.
Comment 20 Nicholas Kazlauskas 2019-03-15 13:07:46 UTC
Not every user will want the same scaling parameters for non-native signals. For this reason, the display itself will often offer a configurable mode for its scaler.

I don't have the exact displays that you have, but I have a test setup with a 4096x2160 and 3840x2160 display and I'm able to mirror the desktop using Plasma and have the image fullscreen on both displays (without black bars) because the 4096x2160 display defaults to fullscreen scaling.

But I can also reproduce your problem by changing the scaling mode in the display's OSD to 1:1 scaling, which shows black bars. This is because the driver doesn't do any scaling itself by default.

The driver can't query the monitor to understand what the display is going to do, and it also can't guess what the user's intended preference is here.

So we expose the standard (i.e. shared across other drivers as well) DRM property "scaling mode" to give more control over the behavior when possible. As for why it's not configurable via a GUI, that's more of a userspace feature request for the Plasma / GNOME / etc. developers.
Comment 21 thomas.lassdiesonnerein 2019-03-15 13:43:03 UTC
Ok thanks again. You can close this as "won't fix" then.

My last and final question: why does the old code (kernel 4.14) work?
Comment 22 thomas.lassdiesonnerein 2019-03-18 13:45:51 UTC
I investigated further, and the culprit seems to be that kernel 4.14 ignores what xrandr *wrongly* detects and correctly sets the connected TV to its native 3840x2160@60Hz without black bars. Xrandr detects only 30Hz for both 4k resolutions on all kernels, yet kernel 4.14 still somehow manages to give me 3840x2160@60Hz without black bars. So could this be an xrandr bug?

Kernels >4.14 seem to respect xrandr and set 4096x2160@30Hz without black bars; maybe my DP-to-HDMI 2.0 adapter is reporting this setting.
I can also get 3840x2160@30Hz without black bars with your xrandr workaround, but not 60Hz.
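
(One hedged hypothesis, purely my assumption and not confirmed anywhere in this report: if the DP-to-HDMI adapter only signals at HDMI 1.4 rates, 4K@60 simply doesn't fit in the available bandwidth while 4K@30 does, which would explain why only 30Hz modes are offered over that output. Using the standard CTA-861 4K raster totals of 4400x2250:)

```python
# Back-of-the-envelope bandwidth check (assumed limits: ~340 MHz max TMDS
# clock for HDMI 1.4, ~600 MHz for HDMI 2.0; 4K CTA-861 timings use a
# 4400x2250 total raster including blanking).

HTOTAL, VTOTAL = 4400, 2250
HDMI_1_4_MAX_HZ = 340_000_000
HDMI_2_0_MAX_HZ = 600_000_000

def pixel_clock_hz(refresh_hz: int) -> int:
    """Pixel clock required for a 4K CTA-861 timing at the given refresh rate."""
    return HTOTAL * VTOTAL * refresh_hz

for hz in (30, 60):
    clk = pixel_clock_hz(hz)
    print(f"4K@{hz}Hz needs {clk / 1e6:.0f} MHz: "
          f"HDMI 1.4 ok={clk <= HDMI_1_4_MAX_HZ}, HDMI 2.0 ok={clk <= HDMI_2_0_MAX_HZ}")
# 4K@30 needs 297 MHz (fits HDMI 1.4); 4K@60 needs 594 MHz (needs HDMI 2.0).
```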


In more detail:

kernel =4.14
-------------
xrandr --props reports that the scaling mode is "None". But there are still no black bars, because the *native* resolution of the TV is 3840x2160 (listed in the panel specs here: https://www.lg.com/us/support-product/lg-OLED65B7P).

So if I change to the 4096x2160 resolution, the desktop is bigger than the screen.
I also get 60Hz on both displays, while xrandr *wrongly* reports only 30Hz available on the TV. The difference between 60 and 30Hz is clearly visible when moving the mouse, and e.g. supertuxkart confirms 60Hz in its FPS overlay.


kernel >4.14
--------------
The TV gets only 30Hz, at both the 4096 and 3840 resolutions. When I force a 60Hz modeline via xorg.conf.d as described here https://wiki.archlinux.org/index.php/xrandr#Adding_undetected_resolutions, I get no signal, or a switch back to 30Hz when cloning in the GUI.

Then the 3840-black-bar workaround only works if I create this file: 

cat /etc/xdg/autostart/Scaling.desktop

[Desktop Entry]
Type=Application
Name=Scaling
Exec=xrandr --output DisplayPort-0  --set "scaling mode" "Full"


Putting it in xorg.conf had no effect. This is what my xorg.conf looks like now. Note that the VRR option works as expected, so it's not as if the file is completely ignored:


cat /etc/X11/xorg.conf.d/10-monitor.conf 

Section "Monitor"
    Identifier "DisplayPort-0"
    Option "scaling mode" "Full"
    Option "PreferredMode" "3840x2160"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "AMD"
EndSection

Section "Device"
    Identifier "AMD"
    Driver "amdgpu"
    Option "DRI" "3"
    Option "TearFree" "true"
    Option "VariableRefresh" "true"
EndSection
