Despite contrary claims by ATI in 2016, Radeon R5 230 graphics cards featuring HDMI, DVI and VGA were originally sold as 4K-ready up to the year 2015. The only proof I have for this is a commercial invoice from Nexus Mobile and a packaging card. As far as I know, the Radeon R5 230 cards currently sold can still be run stably and reliably under UltraHD, provided that you apply the right kernel patch (see the attachment). Unfortunately, neither my former merchant nor ATI has responded to my questions concerning this change in trade and marketing policy.
My request is to have the provided patch integrated into the mainline kernel. It does not change the behaviour of the radeon driver unless you specify a nonzero value for the radeon.hdmimhz parameter. If you do, the R5 230 cards I have tested run stably and reliably for days. At least the 2GB variant of this card has amply sufficient resources for proficient desktop computing under UltraHD, including image manipulation at 3840x2160.
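For anyone who wants to reproduce my setup, the parameter is passed on the kernel command line. A minimal sketch for GRUB-based systems, assuming the attached patch is applied (radeon.hdmimhz does not exist in the mainline driver):

```shell
# /etc/default/grub -- sketch only; radeon.hdmimhz comes from the attached
# patch and is NOT part of the mainline radeon driver.
# Allow TMDS clocks up to 297 MHz, enough for 3840x2160@30 over HDMI:
GRUB_CMDLINE_LINUX_DEFAULT="quiet radeon.hdmimhz=297"

# Afterwards regenerate the GRUB configuration and reboot, e.g.:
#   update-grub                                  # Debian/Ubuntu
#   grub2-mkconfig -o /boot/grub2/grub.cfg       # Fedora/Mageia
```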
While such a parameter has never been officially discussed for the radeon driver, nouveau has already implemented a similar parameter called nouveau.hdmimhz since kernel 4.5.x. Though it thereby becomes possible to specify an hdmimhz far above the card's technical capabilities, the nouveau developers I have talked with say that it would rarely be possible to damage any card by overclocking the TMDS. Indeed, I have successfully overclocked my GeForce 9600M GT to support 4K/2160p.
Even specifying values considerably higher than 225MHz did not damage my GeForce 9600M GT, though the screen stayed black upon nouveau driver initialization. While the Radeon R5 230 works well at 297MHz (as long as you specify that via radeon.hdmimhz), I have, similarly to the GeForce 9600M GT, tried to overclock a Radeon R7 240. It did produce stable images at a higher hdmimhz such as 330, though the HDMI input of my monitor supports no more than 30Hz at 3840x2160 (tested with and without a DP adapter).
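For reference, the 297MHz figure follows directly from the mode timings. A quick sketch of the arithmetic (the totals are the standard CEA-861 values; the helper function is mine):

```python
# TMDS (pixel) clock = horizontal total * vertical total * refresh rate.
# The totals below are the standard CEA-861 values for these modes.

def tmds_clock_mhz(htotal, vtotal, refresh_hz):
    """Required TMDS clock in MHz for the given total timing."""
    return htotal * vtotal * refresh_hz / 1e6

# 3840x2160@30 (CEA-861 VIC 95) has a 4400x2250 total:
print(tmds_clock_mhz(4400, 2250, 30))   # 297.0 -> hence radeon.hdmimhz=297
# 1920x1080@60 (VIC 16) has a 2200x1125 total:
print(tmds_clock_mhz(2200, 1125, 60))   # 148.5 -> within the stock 165 MHz limit
```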
While it remains questionable whether the provided patch can improve things for newer Radeon cards, I believe it to be beneficial for some older cards. At least it is known to be beneficial for the R5 230, initially marketed as 4K-ready. The radeon patch provided with this report has already been accepted by the Mageia 6 distribution. Though the attached patch is meant for application to the current 4.8.0-rcX+ kernels, most of my machines that rely on it still run 4.6.0.
Created attachment 239371 [details]
shipment notification: R5 230 marketed as 4K-ready
Created attachment 239381 [details]
patch introducing radeon.hdmimhz for kernel 4.8.0-rc2
NACK. I rejected this the last time you brought this up. You are running the hw outside of its validated specs. See this discussion:
It is not true that this card would only feature a TMDS of 297 over DP. The card I bought only has HDMI, DVI and VGA, and it was sold as 4K-ready, as you can see from the first attachment.
The ASIC supports 4K over DP or dual-link DVI.
It obviously works stably over HDMI as well. Besides this, I have just tested an ATI Mobility Radeon HD 2600 XT/2700, and it can provide 3840x2160_22.00 over its HDMI port with a radeon.hdmimhz of 250. With an hdmimhz of 297 I get some screen distortions, but that does not harm the card.
Why not just unlock a new feature of so many radeon cards? The nouveau developers have long done so in their device driver.
Running the hardware out of spec is not something we want to support.
any other opinion by someone else on this issue?
Driving the PLL and transmitter way over its limit is clearly not a good idea and can potentially cause hardware failure in the long term.
For the R5 230, long-term experience is already available. I have been successfully using this card since at least February 2016 at a TMDS of 297MHz, and these cards are doing very well in everyday use as well as under occasional full throttle.
Today I set out to try the dual-link feature of the DVI port. First I verified with AOC that my u2868pqu monitor indeed supports dual-link over DVI. Then I connected the DVI output of my R5 230 (comment #6) to the u2868pqu monitor. Booting with a vanilla kernel (hdmimhz=0), I just ended up with a black screen. I finally succeeded in getting a picture by setting hdmimhz=165. At first amazed, I then noticed that hdmimhz also disables the dual-link feature, which was obviously why it worked with hdmimhz set. My explanation is that the radeon driver detected both the display and the card as supporting dual-link DVI. However, the cable does not.
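A short sketch of why the dual-link path matters here (the 165MHz single-link ceiling is from the DVI 1.0 spec; the helper function is my own illustration):

```python
# A single DVI link is specified up to a 165 MHz TMDS clock; dual-link DVI
# carries two pixels per clock, so each link runs at half the pixel clock.

DVI_SINGLE_LINK_MAX_MHZ = 165

def per_link_clock_mhz(pixel_clock_mhz, dual_link):
    """TMDS clock each DVI link must run at for a given pixel clock."""
    return pixel_clock_mhz / 2 if dual_link else pixel_clock_mhz

# 3840x2160@30 needs a 297 MHz pixel clock:
print(per_link_clock_mhz(297, dual_link=False))  # 297.0 -> beyond single-link spec
print(per_link_clock_mhz(297, dual_link=True))   # 148.5 -> fits within 165 MHz
```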
(In reply to Elmar Stellnberger from comment #9)
> any other opinion by someone else on this issue?
FWIW enthusiasts love such things, but corporations do not. Hence you generally don't see overclocking and similar out-of-spec features in drivers not maintained by the community.
Personally I've always thought the risk of damaging hardware with any kind of overclocking is just about exactly zero as long as you don't increase voltage levels (and you can handle the additional heat but that should be a non-issue here). That's my limited understanding of the physics behind it :-).
I suspect part of the reason why it overclocks so well is that this chip should be DP 1.2 capable - meaning HBR2 mode, which has a clock of 270MHz (not that you can really get cards with that chip which actually have a DP port...). Now the signalling is different with DP vs. HDMI/DVI, but if I had to take a guess, I'd suspect the hw is mostly all the same.
But none of that is going to change anyone's opinion on overclocking...
(In reply to Roland Scheidegger from comment #13)
> Personally I've always thought the risk of damaging hardware with any kind
> of overclocking is just about exactly zero as long as you don't increase
> voltage levels
Unfortunately this is exactly what happens here. The clock is generated by a voltage-controlled oscillator, and for the desired resolution you need to overclock it by about 30-40%.
That in turn means you raise the voltage way over the nominal limit.
Those oscillators are designed to handle voltages about 250% over the nominal level without frying immediately, but that says absolutely nothing about the aging of the circuit under those conditions.
The PLL we are talking about here clearly isn't designed for that level of operation, and even the closed-source drivers (which are otherwise rather friendly to overclocking) don't let the user override this absolute limit.
So this is a clear NAK from my side.
(In reply to Christian König from comment #14)
> (In reply to Roland Scheidegger from comment #13)
> > Personally I've always thought the risk of damaging hardware with any kind
> > of overclocking is just about exactly zero as long as you don't increase
> > voltage levels
> Unfortunately this is exactly what happens here. The clock is generated by a
> voltage controlled oscillator and for the desired resolution you need to
> over clock it by about 30-40%.
> That in turn means you raise the voltage way over the nominal limit.
Oh interesting - didn't know voltage was directly tied to clock frequency here. Makes sense then to not allow it (at least if that circuitry isn't shared with DP, as the DP link runs at much higher clock (540Mhz actually), but I suppose it's really different there).
(In reply to Roland Scheidegger from comment #15)
> Oh interesting - didn't know voltage was directly tied to clock frequency
> here. Makes sense then to not allow it (at least if that circuitry isn't
> shared with DP, as the DP link runs at much higher clock (540Mhz actually),
> but I suppose it's really different there).
The voltage is only indirectly related, but yes, overclocking such parts is a rather bad idea in the long term.
DP uses a fixed frequency, which is way easier to generate than the variable HDMI/DVI/VGA clock.
The patch does not only allow overclocking; it can also be used to disable the dual-link feature. In certain cases this may be necessary to get a picture at all (see comment 12).
I've just tried this patch on a Radeon HD 6850 and it works great (2560x1080 at 185.58MHz). While that is indeed more than AMD's site says is supported, the official AMD drivers for Windows support this resolution out of the box, so I think it's safe to at least raise the limit to 186MHz.
I read all the previous comments - for and against the adoption of the patch.
Question: why is it easy to find a pixel clock patch for WINDOWS (the AMD/ATI Pixel Clock Patcher by 'ToastyX' at www.monitortests.com), available and supported since 2012 and successfully running 4K screens (since January 2016) on the same Radeon cards being discussed here? Many people discuss it on that webpage, and it seems to work. I personally tested it on a few older Radeon cards, and it works at 3840x2160 for me. I have NOT yet run it for hours (I rarely run Windows, and then only to test or fix PCs for others). I have not attached temperature sensors to the heat sinks yet.
If the pixel clock generator circuitry is on the same die as everything else, then it shares a heat sink. The whole thing is designed such that, with the recommended airflow across that heat sink, the GPU remains functional. BUT that can mean running 3 separate displays - utilizing the full capacity of the GPU.
IF this patch from Elmar Stellnberger were used to run a SINGLE 4K LCD at 3840x2160 with a 30% overclock on the pixel clock generator, maybe the overall GPU would generate much less than maximum heat, and the heatsink/fan could easily keep it cool.
Does anyone have a maximum pixel clock specification for the various pixel clock generator designs on the various ATI/AMD dies?
Did ATI set limits due to HDMI cables and the overall ability of the heatsink to dissipate the heat when running the GPU at max speed/load on 3 screens?
Is there a listed maximum voltage that the PLL can run at long-term without damage? For most integrated circuit data sheets I have ever read, there is a relationship between max speed and temperature of the die.
Also, the patch allowing a pixel clock increase for Nvidia cards has been in the kernel since 4.5.x - also something I successfully tested recently at 3840x2160 on an OLD Nvidia card. I thought Linux was the OS that is supposed to allow experiments, variety and versatility. Seems like NOT when talking about Radeon cards.