I recently bought an AMD RX 570 card and, because of coil whine, later replaced it with an nVidia GTX 1060. Both were connected via HDMI to a 23" ASUS VX239H IPS monitor.
I was surprised how different the image is on the two cards under the same settings. On the desktop, the colors on AMD are more vivid (even after applying the nVidia fix of setting Output dynamic range = Full), while the image on nVidia was sharper.
I tested this in a couple of games, namely Rainbow Six Siege with the Ultra HD texture pack and settings on Ultra, exactly the same settings on both cards.
However, there were substantial differences in the resulting image. The wire fence, seen from farther away in the first training mission, was flashing on the AMD card, but on nVidia it disappeared (it was simply invisible from far away).
I also compared the image in Crysis 3; after switching from AMD to nVidia, the in-game image became sharper and more stable, without the extra flashing of distant or small objects.
Is this caused by driver settings (such as the global settings in the Crimson driver), or can scenes rendered in DirectX games with exactly the same settings differ depending on the GPU chip used?
I have already returned the RX 570, but I had maximized the global settings in its Crimson driver. Can the same be done on nVidia to get a fair comparison? – Vojtěch Dohnal – 2017-05-10T11:09:07.857
Yes, set everything to max quality in the nVidia driver. – Overmind – 2017-05-10T11:54:26.410
I tried that, but was never able to reproduce the fence flashing in Rainbow Six like on the AMD card. However, Crysis 3 got spoiled the same way after maximizing the global settings on nVidia, so you were probably right. – Vojtěch Dohnal – 2017-05-11T18:04:41.733