Bug 105331
| Summary: | Dithering doesn't work on HD4600 when using DP | | |
|---|---|---|---|
| Product: | Drivers | Reporter: | Dihan Wickremasuriya (nayomal) |
| Component: | Video (DRI - Intel) | Assignee: | intel-gfx-bugs (intel-gfx-bugs) |
| Status: | RESOLVED CODE_FIX | | |
| Severity: | normal | CC: | intel-gfx-bugs, mario.kleiner, ville.syrjala |
| Priority: | P3 | | |
| Hardware: | x86-64 | | |
| OS: | Linux | | |
| Kernel Version: | 4.2.0 | Subsystem: | |
| Regression: | No | Bisected commit-id: | |
| Attachments: | dmesg output (4.2.0), xrandr output (4.2.0), intel_reg_dumper output, VBIOS dump | | |
Description
Dihan Wickremasuriya 2015-10-02 00:07:46 UTC
Created attachment 189241 [details]
xrandr output (4.2.0)
Created attachment 189251 [details]
intel_reg_dumper output
Created attachment 189261 [details]
VBIOS dump
Ville, why do we default to 24 bpp instead of 18 bpp when the EDID has no bpc information?

No idea. The DP spec says: "In determining the colorimetry format, the Source device must check the capability of the Sink device via an EDID read. When the Sink device capability is unknown, for example due to the corruption of EDID, the Source device must fall back to 18bpp RGB, with full dynamic range" ... "Note: The Source device falls back to 18bpp, VESA range RGB when the sink capability is unknown." I guess part of the problem is that the code that does the limiting now is agnostic to connector type. HDMI might have different rules?

Dihan, sorry for the delay, please try http://patchwork.freedesktop.org/patch/msgid/1452695720-7076-1-git-send-email-jani.nikula@intel.com

Thanks for the fix Jani, it works great! I tested on top of 4.4.0, which exhibits the same symptom without your patch.

(In reply to Dihan Wickremasuriya from comment #8)
> Thanks for the fix Jani, it works great! I tested on top of 4.4.0 which
> exhibits the same symptom without your patch.

Thanks for testing. The fix has been pushed to drm-intel-next-queued with cc: stable, so it will eventually get backported to stable kernels.

    commit 013dd9e038723bbd2aa67be51847384b75be8253
    Author: Jani Nikula <jani.nikula@intel.com>
    Date:   Wed Jan 13 16:35:20 2016 +0200

        drm/i915/dp: fall back to 18 bpp when sink capability is unknown

I haven't tested this yet, but doesn't this change degrade any DVI or analog VGA display to 6 bpc/18 bpp if it is connected to DisplayPort via a DP->DVI or DP->VGA adapter? That would be pretty disastrous for many medical/scientific vision-testing applications that need at least 8 bpc without dithering and often have to rely on specialized DVI/VGA display hardware, given that many machines nowadays only have DP outputs and no real DVI or VGA connectors anymore. A similar problem was fixed in e8fa4270536de2e5e8205fb2b90bb26afc471729 "drm/i915: Only dither on 6bpc panels" just a couple of months ago, and I'm worried this reintroduces the problem for DP.

(In reply to Mario Kleiner from comment #10)
> I haven't tested this yet, but doesn't this change degrade any DVI or analog
> VGA display to 6 bpc/18 bpp if it is connected to DisplayPort via a DP->DVI
> or DP->VGA adapter?

That depends on whether the displays have EDID 1.4 and advertise a sink capability higher than 18 bpp or not. I presume the spec was written this way to avoid black screens, and to a large part that's the main priority we have too.

Black screens are not good, but degrading lots of well-working displays out there isn't either, and 6 bpc for VGA and DVI sinks cannot be right. I think our general EDID handling needs a bit of improvement for EDID < 1.4, or your patch would need to be more selective wrt. what kind of sink is connected to DP, or we need some manual override; otherwise this will be pretty bad for lots of scientific users. It will practically brick their systems for the purpose for which they bought them if they depend on DP->DVI/VGA output, which many do. The patch has already reached stable distros via kernel updates.

CRT monitors and other analog VGA-driven displays have essentially infinite color resolution due to the analog input signal, limited only by the DAC resolution at the source. Currently our drm_add_display_info() routine in drm_edid.c assumes "unknown bpc" for any analog source, with the code comment "driver figures it out in this case".
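For context, here is a minimal, standalone sketch (not the actual i915 or drm_edid.c code) of the policy being discussed: when the EDID leaves the sink's depth unknown (bpc == 0), a DP source falls back to 18 bpp, otherwise the pipe bpp is bounded by what the EDID advertised. The function name and the simplified struct are illustrative assumptions only.

```c
#include <stdio.h>

/*
 * Simplified stand-in for struct drm_display_info: bpc == 0 means the
 * EDID did not report the sink's depth ("driver figures it out").
 */
struct sink_info {
	unsigned int bpc;	/* bits per component, 0 = unknown */
};

/*
 * Mirrors the DP-spec rule quoted above in spirit only: unknown sink
 * capability -> fall back to 18 bpp (6 bpc) RGB; otherwise limit the
 * requested bpp to 3 * bpc.
 */
static int dp_pipe_bpp(const struct sink_info *info, int requested_bpp)
{
	int limit = (info->bpc == 0) ? 18 : 3 * (int)info->bpc;

	return requested_bpp < limit ? requested_bpp : limit;
}

int main(void)
{
	struct sink_info unknown = { .bpc = 0 };  /* e.g. EDID 1.3 DVI sink behind a DP adapter */
	struct sink_info deep    = { .bpc = 10 }; /* EDID 1.4 sink advertising 10 bpc */

	printf("unknown sink: %d bpp\n", dp_pipe_bpp(&unknown, 24)); /* -> 18 */
	printf("10 bpc sink:  %d bpp\n", dp_pipe_bpp(&deep, 24));    /* -> 24 */
	return 0;
}
```

This is exactly the combination Mario is worried about: any sink whose EDID predates the 1.4 bpc fields ends up in the "unknown" branch and gets dithered down to 6 bpc on DP.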
I'd assume driving the VGA sink with the highest bpc the hardware can deliver should be safe, so assigning a large bpc, e.g. >= 16 bits, would deal with analog sinks.

I also googled around a bit and found PDFs describing the different EDID versions, the DVI spec and its predecessor, the DFP 1.x spec. As far as my understanding of EDID 1.3 goes, for a digital sink, if bit 0 of byte 0x14 is set to 1 (for us: if (edid->input & DRM_EDID_DIGITAL_TYPE_DVI)), then the sink is a "DFP 1.x compatible TMDS" device, which seems to mean the display can take 8 bpc / 24 bpp input signals.

From https://www.tu-chemnitz.de/informatik/RA/news/stack/kompendium/vortraege_99/peripherie/standards/dfp/DFP.pdf, section 3.10 "EDID support":

"If the DFP monitor only supports EDID 1.X (1.1, 1.2, etc.) without extensions, the host will make the following assumptions:
1. 24-bit MSB-aligned RGB TFT
2. DE polarity is active high
3. H and V syncs are active high
4. Established CRT timings will be used
5. Dithering will not be enabled on the host"

The DVI spec has similar wording wrt. DVI signals being compatible with DFP sinks, with 24 bits per pixel input signal support being mandatory.

My proposal would be a patch, also for stable, that assigns info->bpc = 16 for analog sinks instead of 0, and info->bpc = 8 if ((info->bpc == 0) && (edid->input & DRM_EDID_DIGITAL_TYPE_DVI)) at the end of EDID 1.3 parsing.
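A rough sketch of what that proposal could look like, hooked in at the end of EDID 1.3 parsing. This is only an illustration of the idea, not a tested kernel patch; it assumes the existing edid->input flags and drm_display_info fields from drm_edid.h, and the helper name is hypothetical.

```c
/*
 * Illustrative only: intended to run where drm_add_display_info()
 * currently leaves bpc == 0 ("unknown") for pre-1.4 EDIDs.
 */
static void fixup_legacy_edid_bpc(const struct edid *edid,
				  struct drm_display_info *info)
{
	if (!(edid->input & DRM_EDID_INPUT_DIGITAL)) {
		/*
		 * Analog (VGA/CRT) sink: depth is limited only by the
		 * source DAC, so report something comfortably large.
		 */
		info->bpc = 16;
	} else if (info->bpc == 0 &&
		   (edid->input & DRM_EDID_DIGITAL_TYPE_DVI)) {
		/*
		 * EDID 1.3 digital sink with bit 0 of byte 0x14 set:
		 * "DFP 1.x compatible TMDS", which must accept
		 * 24 bpp (8 bpc) input per the DFP/DVI specs.
		 */
		info->bpc = 8;
	}
}
```

With such a fixup in place, the DP "fall back to 18 bpp when bpc is unknown" rule would no longer trigger for analog sinks or DFP/DVI-class sinks behind DP adapters, while genuinely unknown sinks would still get the safe fallback.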