Bug 172421 - radeon: allow to set the TMDS frequency by a special kernel parameter
Summary: radeon: allow to set the TMDS frequency by a special kernel parameter
Status: NEW
Alias: None
Product: Drivers
Classification: Unclassified
Component: Video(DRI - non Intel)
Hardware: All
OS: Linux
Importance: P1 enhancement
Assignee: drivers_video-dri
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2016-09-21 18:15 UTC by Elmar Stellnberger
Modified: 2023-04-04 08:21 UTC
CC List: 9 users

See Also:
Kernel Version: 4.8.0-rc7+
Subsystem:
Regression: No
Bisected commit-id:


Attachments
shipment notification: R5 230 marketed as 4K-ready (303.50 KB, image/jpeg) - 2016-09-21 18:16 UTC, Elmar Stellnberger
patch introducing radeon.hdmimhz for kernel 4.8.0-rc2 (4.70 KB, text/plain) - 2016-09-21 18:17 UTC, Elmar Stellnberger

Description Elmar Stellnberger 2016-09-21 18:15:16 UTC
Despite different claims by ATI in 2016, Radeon R5 230 graphics cards featuring HDMI, DVI and VGA had originally been sold as 4K-ready up to the year 2015. The only proof I have for this is a commercial invoice from Nexus Mobile and a packaging card. As far as I know, the Radeon R5 230 cards currently sold can still be run stably and reliably under UltraHD, provided that you apply the right kernel patch (see the attachment). Unfortunately, neither my former merchant nor ATI has responded to my questions concerning this change in trade and marketing policy.
  
  My request would now be to integrate the provided patch into the mainline kernel. It does not change the behaviour of the radeon driver unless you specify a nonzero value for the radeon.hdmimhz parameter. If you do, the R5 230 cards I have tested run stably and reliably for days. At least the 2GB variant of this card has ample resources for proficient desktop computing under UltraHD, including image manipulation at 3840x2160.
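  To illustrate the mechanism (this is a simplified sketch of the approach, not the attached patch verbatim, and radeon_hdmimhz_max_khz() is a made-up helper name): the parameter is an ordinary module parameter that, when nonzero, replaces the driver's TMDS ceiling, enabled at boot with e.g. radeon.hdmimhz=297:

#include <linux/module.h>

/* Simplified sketch of the radeon.hdmimhz mechanism; not the attached
 * patch verbatim. 0 keeps the stock behaviour. */
static int radeon_hdmimhz;      /* 0 = driver default, i.e. no change */
module_param_named(hdmimhz, radeon_hdmimhz, int, 0444);
MODULE_PARM_DESC(hdmimhz, "Override max HDMI TMDS clock in MHz (0 = default)");

static int radeon_hdmimhz_max_khz(int default_max_khz)
{
        /* Only deviate from the validated limit if the user asked for it. */
        if (radeon_hdmimhz > 0)
                return radeon_hdmimhz * 1000;   /* MHz -> kHz */
        return default_max_khz;
}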
  
  While such a parameter has never been officially discussed for the radeon driver, nouveau has already implemented a similar one, nouveau.hdmimhz, since kernel 4.5.x. Although it thereby becomes possible to specify an hdmimhz far above the card's technical capabilities, the nouveau developers I have talked with say that it would rarely be possible to damage any card by overclocking the TMDS. Indeed, I have successfully overclocked my GeForce 9600M GT to deliver 4K/2160p.
  
  Even specifying values considerably higher than 225MHz did not damage my GeForce 9600M GT, though the screen stayed black upon nouveau driver initialization. While the Radeon R5 230 works well at 297MHz (as long as you specify that via radeon.hdmimhz), I have, similarly to the GeForce 9600M GT, tried to overclock a Radeon R7 240. It produced stable images at a higher hdmimhz such as 330, though the HDMI input of my monitor supports no more than 30Hz at 3840x2160 (tested with and without a DP adapter).
  
  While it remains questionable whether the provided patch can improve things for newer Radeon cards, I believe it to be beneficial for some older cards. At least it is known to be beneficial for the R5 230, initially marketed as 4K-ready. The radeon patch provided with this report has already been accepted by the Mageia 6 distribution. Though the attached patch targets the current 4.8.0-rcX+ kernels, most of my machines that rely on it still run 4.6.0.
Comment 1 Elmar Stellnberger 2016-09-21 18:16:15 UTC
Created attachment 239371 [details]
shipment notification: R5 230 marketed as 4K-ready
Comment 2 Elmar Stellnberger 2016-09-21 18:17:10 UTC
Created attachment 239381 [details]
patch introducing radeon.hdmimhz for kernel 4.8.0-rc2
Comment 3 Alex Deucher 2016-09-21 18:30:58 UTC
NACK.  I rejected this the last time you brought this up.  You are running the hw outside of its validated specs.  See this discussion:
https://bugs.freedesktop.org/show_bug.cgi?id=93885
Comment 4 Elmar Stellnberger 2016-09-21 18:37:39 UTC
  It is not true that this card only features a TMDS of 297 over DP. The card I have bought only has HDMI and DVI, and it was sold as 4K-ready, as you can see from the first attachment.
Comment 5 Elmar Stellnberger 2016-09-21 18:38:14 UTC
... and VGA
Comment 6 Alex Deucher 2016-09-21 18:49:04 UTC
The ASIC supports 4K over DP or dual-link DVI.
Comment 7 Elmar Stellnberger 2016-09-21 19:59:28 UTC
  It obviously works stably over HDMI as well. Besides this, I have just tested an ATI Mobility Radeon HD 2600 XT/2700, and it can provide 3840x2160_22.00 over its HDMI port with a radeon.hdmimhz of 250. With an hdmimhz of 297 I get some screen distortions, but that does not harm the card.
  Why not just unlock a new feature of so many radeon cards? The nouveau developers have long done so with their device driver.
Comment 8 Alex Deucher 2016-09-21 20:16:35 UTC
Running the hardware out of spec is not something we want to support.
Comment 9 Elmar Stellnberger 2016-09-21 20:20:21 UTC
any other opinion by someone else on this issue?
Comment 10 Christian König 2016-09-22 07:07:01 UTC
Driving the PLL and transmitter way over their limits is clearly not a good idea and can potentially cause hardware failure in the long term.
Comment 11 Elmar Stellnberger 2016-09-22 07:24:17 UTC
  For the R5 230, long-term experience is already available. I have been using this card successfully since at least February 2016 at a TMDS of 297MHz, and these cards are doing very well under everyday use as well as occasional full throttle.
Comment 12 Elmar Stellnberger 2016-09-22 13:51:32 UTC
  Today I set out to try the dual-link feature of the DVI port. First I verified with AOC that my u2868pqu monitor indeed supports dual-link over DVI. Then I connected the DVI output of my R5 230 (comment #6) to the u2868pqu monitor over DVI. Booting with a vanilla kernel (hdmimhz=0), I just ended up with a black screen. I finally succeeded in getting a picture by setting hdmimhz=165. Amazed at first, I then noticed that hdmimhz also disables the dual-link feature, which was obviously why it worked. My explanation for this is that the radeon driver detected both the display and the card to support dual-link DVI; however, the cable does not. A sketch of the usual decision logic follows below.
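  For illustration, drivers typically switch to dual link purely from the requested pixel clock, since the cable's own capabilities cannot be probed; the following is a generic sketch, not the actual radeon code, and the function name is made up:

#include <linux/types.h>

/* Generic sketch of a dual-link decision as commonly implemented for DVI.
 * Because a cable cannot report whether it wires up the second link, the
 * choice rests on the pixel clock alone: anything above the single-link
 * TMDS ceiling goes out over both links, and a single-link-only cable
 * then yields a black screen. */
#define SINGLE_LINK_MAX_KHZ 165000      /* single-link DVI TMDS ceiling */

static bool dvi_needs_dual_link(unsigned int pixel_clock_khz)
{
        return pixel_clock_khz > SINGLE_LINK_MAX_KHZ;
}

Capping hdmimhz at 165 keeps every mode below that threshold, which is presumably why it also acts as a dual-link kill switch.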
Comment 13 Roland Scheidegger 2016-09-23 02:53:11 UTC
(In reply to Elmar Stellnberger from comment #9)
> any other opinion by someone else on this issue?

FWIW enthusiasts love such things, but corporations do not. Hence you generally don't see overclocking and similar out-of-spec features in drivers that are not community-maintained.
Personally I've always thought the risk of damaging hardware with any kind of overclocking is just about exactly zero as long as you don't increase voltage levels (and you can handle the additional heat but that should be a non-issue here). That's my limited understanding of the physics behind it :-).
I suspect part of the reason why it overclocks so well is also that this chip should be DP 1.2 capable - meaning HBR2 mode, which has a clock of 270MHz (not that you can really get cards with that chip which actually do have the DP port...). Now the signalling is different with DP vs. HDMI/DVI, but if I had to take a guess I'd suspect the hw is mostly all the same.
But none of that is going to change the opinion of overclocking of anyone...
Comment 14 Christian König 2016-09-23 07:36:15 UTC
(In reply to Roland Scheidegger from comment #13)
> Personally I've always thought the risk of damaging hardware with any kind
> of overclocking is just about exactly zero as long as you don't increase
> voltage levels

Unfortunately this is exactly what happens here. The clock is generated by a voltage-controlled oscillator, and for the desired resolution you need to overclock it by about 30-40%.

That in turn means you raise the voltage way over the nominal limit.
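
(Back-of-the-envelope: taking a nominal TMDS ceiling of roughly 225MHz - an assumed figure, not a datasheet value - 4K@30 at 297MHz gives 297/225 ≈ 1.32, i.e. right in that 30-40% range.)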

Those oscillators are designed to handle voltages about 250% over the nominal level without frying immediately, but that says absolutely nothing about the aging of the circuit under those conditions.

The PLL we are talking about here clearly isn't designed for that level of operation, and even the closed-source drivers (which are otherwise rather friendly to overclocking) don't let the user override this absolute limit.

So this is a clear NAK from my side.
Comment 15 Roland Scheidegger 2016-09-23 16:22:25 UTC
(In reply to Christian König from comment #14)
> (In reply to Roland Scheidegger from comment #13)
> > Personally I've always thought the risk of damaging hardware with any kind
> > of overclocking is just about exactly zero as long as you don't increase
> > voltage levels
> 
> Unfortunately this is exactly what happens here. The clock is generated by a
> voltage controlled oscillator and for the desired resolution you need to
> over clock it by about 30-40%.
> 
> That in turn means you raise the voltage way over the nominal limit.

Oh interesting - I didn't know voltage was directly tied to clock frequency here. Makes sense then not to allow it (at least if that circuitry isn't shared with DP, as the DP link runs at a much higher clock (540MHz actually), but I suppose it's really different there).
Comment 16 Christian König 2016-09-23 17:27:08 UTC
(In reply to Roland Scheidegger from comment #15)
> Oh interesting - didn't know voltage was directly tied to clock frequency
> here. Makes sense then to not allow it (at least if that circuitry isn't
> shared with DP, as the DP link runs at much higher clock (540Mhz actually),
> but I suppose it's really different there).

The voltage is only indirectly related, but yes, overclocking such parts is a rather bad idea in the long term.

DP uses a fixed frequency which is way easier to generate than the variable HDMI/DVI/VGA clock.
Comment 17 Elmar Stellnberger 2016-09-24 18:08:32 UTC
  The patch does not only allow overclocking; it can also be used to disable the dual-link feature. In certain cases this may be necessary to get a picture at all (see comment 12).
Comment 18 Paweł Zmarzły 2017-12-16 18:52:38 UTC
I've just tried this patch on a Radeon HD 6850 and it works great (2560x1080 at 185.58MHz). While it is indeed more than the AMD site says is supported, the official AMD drivers for Windows support this resolution out of the box, so I think it's safe to at least increase the limit to 186MHz.
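As a sanity check on that figure (assuming 60Hz; the raster totals are my guess, not read from the EDID): 185.58MHz / 60 ≈ 3.09 Mpixels per frame, which matches a reduced-blanking total raster around 2720x1137 for the 2560x1080 active area.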
Comment 19 John 2018-11-27 03:33:29 UTC
I read all the previous comments - for and against the adoption of the patch. 
Question: Why is it easy to find a pixel clock patch for WINDOWS (www.monitortests.com AMD/ATI Pixel Clock Patcher by someone named 'ToastyX') - available and supported since 2012 - and successfully running 4K screens (since January 2016) on the same Radeon cards being discussed here? Many people on that webpage are discussing it. Seems to work. I personally tested it on a few older Radeon cards and it works at 3840x2160 for me. I have NOT yet run it for hours (I rarely run Windows, and then only to test or fix PCs for others). I have not connected temperature sensors to heat sinks yet.

If the pixel clock generator circuitry is on the same die as everything else, then it shares a heat sink. The whole thing is designed such that, with the recommended airflow across that heat sink, the GPU remains functional. BUT, that can mean running 3 separate displays - utilizing the full capacity of the GPU.

IF this patch from Elmar Stellnberger were used to run a SINGLE 4K LCD at 3840x2160 with a 30% overclock on the pixel clock generator, maybe the overall GPU would be generating much less than maximum heat, and the heatsink/fan could easily keep it cool.

Does anyone have a maximum pixel clock specification for the various pixel clock generator designs on the various ATI/AMD dies?
Did ATI set limits due to HDMI cables and the overall ability of the heatsink to dissipate the heat when running the GPU at max speed/load on 3 screens?
Is there a listed maximum voltage that the PLL can run at - long term - without damage? For most integrated circuit data sheets I have ever read, there is a relationship between max speed and temperature of the die.
Comment 20 John 2018-11-27 12:28:52 UTC
Also, the patch allowing a pixel clock increase for Nvidia cards has been in the kernel since 4.5.x - also something I successfully tested recently at 3840x2160 resolution on an OLD Nvidia card. I thought Linux was the OS that is supposed to allow experiments and variety & versatility. Seems like NOT when talking about Radeon cards.
Comment 21 Joachim Hoss 2019-02-23 12:44:46 UTC
With the help of Elmar Stellnberger's patch applied to kernel 4.20 I can now run my old Radeon HD 5870 at 3840x1600 resolution, which it does "out of the box" under Windows.
I would also like to recommend this patch for inclusion in the kernel.
Comment 22 Elmar Stellnberger 2022-06-19 15:33:30 UTC
  The patch was not accepted because someone claimed it could damage the hardware. All lies. As the years have gone by, these graphics cards are still in daily use with this UHD patch and no damage has ever occurred. As this hardware is still in use, I would really like to see this in mainline, not just in the Mageia kernel. Someone who knows about it told me it would be highly improbable that this could be detrimental to the hardware. The only failure mode would be that the screen stays black at too high a TMDS, and nobody would ever continue to run a graphics card with a TMDS that yields a black screen. This hardware is close to being phased out on many computers by now, and I'd regard it as ridiculous if the patch were withheld because of the argument that it could damage hardware in the long run. This hardware will be phased out long before any damage could ever occur.
  As distributors and kernel developers still support Pentium 4 hardware, I am reopening this bug. This is about Core 2-era hardware, which is totally sufficient for UHD desktop computers. I would never give up my Xi3650 machines because they are utterly silent, something you don't get with a newer computer. And unfortunately the Xi3650 does not work with newer 3D gaming UHD graphics cards (these cards inhibit s2ram, also on Windows, which isn't what you want for a desktop system).
Comment 23 Matt Weiland 2022-06-19 15:43:57 UTC
I think it is rather a political issue that the patch has not been accepted up to now. So many people have reported it being useful, not only here on the kernel bugzilla, and I would really welcome it if you kernel developers started to rethink your decision about it. There is still time, and the patch is still useful.
Comment 24 Eduard Bloch 2023-01-03 12:16:24 UTC
Hey guys, I am trying to port this mod to AMDGPU. The intention is, however, not to unlock the advertised features but to force an APU to accept HDMI clocks over a DVI port (with a DVI-HDMI cable, the monitor expects an HDMI signal).

The port itself was easy, but it does not work. The amdgpu.hdmimhz=... parameter apparently has no effect; Xorg keeps reporting:

[    16.107] (--) AMDGPU(0): HDMI max TMDS frequency 280000KHz

Does anyone have an idea how to tackle this?

Here is the patch: https://github.com/Code7R/linux/commits/amdgpu-custom-maxtdmsclock
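
For what it's worth, one thing I have not ruled out yet (a generic observation about DRM drivers, not a diagnosis of this particular port): besides any driver-side constant, a per-connector TMDS ceiling is parsed from the display's EDID into drm_display_info.max_tmds_clock and enforced in the connector's mode_valid callback, so a raised driver limit can still be overridden by that second clamp. A sketch, in which everything except drm_display_info.max_tmds_clock is a hypothetical name:

#include <drm/drm_connector.h>
#include <drm/drm_modes.h>

/* Generic sketch of a connector mode_valid clamp as found in DRM drivers;
 * not actual amdgpu code. drm_display_info.max_tmds_clock is a real field
 * (kHz, parsed from the EDID's HDMI vendor block); hdmimhz_override and
 * example_mode_valid are made-up names. */
static int hdmimhz_override;    /* hypothetical module parameter, in MHz */

static enum drm_mode_status
example_mode_valid(struct drm_connector *connector,
                   struct drm_display_mode *mode)
{
        int max_khz = connector->display_info.max_tmds_clock;  /* from EDID */

        if (hdmimhz_override)
                max_khz = hdmimhz_override * 1000;      /* MHz -> kHz */

        if (max_khz && mode->clock > max_khz)
                return MODE_CLOCK_HIGH;
        return MODE_OK;
}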
Comment 25 James Hendry 2023-04-04 08:21:08 UTC
With the help of Elmar Stellnberger's patch to the kernel, I managed to make a patch for the 6.2.0 kernel that allowed me to use a DisplayPort -> HDMI adaptor for my ultrawide monitor (2560x1080@60Hz). The 165MHz limit was just below what I required (I needed about 166MHz for my monitor), so very, very minimal overclocking took it from not working at all to working perfectly. This works natively in Windows, so clearly they have determined that it is acceptable, and I don't see any real risk.
