I have a Sapphire ATI Radeon HD 2600 PRO graphics card in my Windows Server 2008 x64 machine. It has two outputs, a VGA and a DVI. I have connected the DVI to my Dell 24" monitor at its 1920x1200 resolution and it works 100%. The VGA is connected to my second monitor, a Samsung 22" with a native resolution of 1680x1050. But the ATI driver and Catalyst Control Centre don't show this resolution as an option. If I choose a lower resolution like 1280x1024 it looks really bad and fuzzy. I searched on Google and downloaded the PowerStrip tool, which allowed me to create a custom resolution of 1680x1050; that option then shows up in Catalyst Control Centre and my second monitor works fine now.
But I don't want to pay for an application just to choose a display resolution. Why doesn't ATI show me that option by default, even though it has no problem actually driving the display at that resolution? Is there a way to get 1680x1050 using the ATI drivers only?
I had to go to Desktop and Display > Display 2 > Configure and uncheck "[ ] Use Extended Display Identification Data (EDID) or driver defaults". After unchecking, I was able to force 1680x1050 as the maximum resolution for the monitor. This added it to all the appropriate dropdowns. – Leons – 2012-03-27T14:30:10.240
Unfortunately that EDID option is available only for VGA. I have a similar issue with DVI: Catalyst thinks that my monitor does not support 1360x768 and enables GPU scaling, which makes everything look blurred. Only on VGA is the image crisp even at 1360x768, but I have to uncheck that EDID box, or else Catalyst GPU scaling kicks in again and makes things blurry. – JustAMartin – 2015-07-31T11:43:11.550
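If you want to check what the driver actually exposes before (or after) toggling that EDID checkbox, a small Win32 program can list the modes Windows reports for the second display and test whether 1680x1050 would be accepted. This is only a diagnostic sketch, not part of Catalyst: the device name \\.\DISPLAY2 is an assumption and should be taken from EnumDisplayDevices on the actual machine, and CDS_TEST cannot add a mode the driver refuses to advertise.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Assumed device name for the second (VGA) output; confirm it
           with EnumDisplayDevices on the real machine. */
        const char *device = "\\\\.\\DISPLAY2";

        DEVMODEA dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* List every mode the driver exposes for this output. If 1680x1050
           is missing here, the driver (not Windows) is filtering it out,
           e.g. because of the EDID it read from the monitor. */
        for (DWORD i = 0; EnumDisplaySettingsA(device, i, &dm); i++)
            printf("%lux%lu @ %lu Hz\n",
                   dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

        /* Ask the driver whether it would accept 1680x1050 without
           actually switching to it (CDS_TEST). */
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        dm.dmPelsWidth  = 1680;
        dm.dmPelsHeight = 1050;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT;

        LONG r = ChangeDisplaySettingsExA(device, &dm, NULL, CDS_TEST, NULL);
        printf(r == DISP_CHANGE_SUCCESSFUL ? "1680x1050 accepted\n"
                                           : "1680x1050 rejected by the driver\n");
        return 0;
    }

Building with MSVC needs user32.lib (for example: cl modes.c user32.lib). If the mode only appears in the enumeration after unchecking the EDID box, that confirms the driver was filtering it based on the monitor's reported EDID rather than any Windows limitation.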