I recently read that the only difference between DVI and VGA is that DVI is digital whereas VGA is analog, which means the VGA signal might pick up noise along the way, especially over a cheap cable. I currently convert from DVI (the only output my GPU provides) to VGA directly at the GPU, and then pass the signal to my two screens through VGA cables. The cables themselves are not too long, but the screens are big (and so run at higher resolutions), which might cause some trouble. Would it be wiser to use DVI cables and convert to VGA (the only input my monitors accept) right before each monitor?
Does it look okay? Then don't bother changing anything. – Shinrai – 2011-09-27T18:30:24.647
You're not converting the signal in either configuration. The DVI connector on your GPU includes the ability to output an analog signal and does so when you connect a VGA monitor. – Brian – 2011-09-27T18:44:51.807
@Shinrai I can't really tell since this is how it's always been. – Gabriele Cirulli – 2011-09-27T18:51:35.350
@Brian Are you sure? I've used two converters that connect to the DVI out and have a connector for a VGA cable. – Gabriele Cirulli – 2011-09-27T18:52:47.033
@GabrieleCirulli - What he's saying is that most DVI outputs already carry an analog VGA signal in addition to the digital one, so nothing has to be converted. This is not ALWAYS the case, but it is ALMOST always the case. This is why the converters (really just adapters) are small and don't require external power. Your adapters could POTENTIALLY introduce a signal strength issue, but it's incredibly unlikely; hence my initial comment. You're very unlikely to see a difference. – Shinrai – 2011-09-27T19:24:24.210
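
If you want to check which kind of DVI port your GPU actually has (DVI-I also carries the analog signal a passive VGA adapter needs; DVI-D is digital-only and would require an active converter), and you happen to be on Linux, the kernel's DRM subsystem names each connector under /sys/class/drm. The following is a minimal sketch under that assumption; the exact connector names (e.g. card0-DVI-I-1) depend on your GPU and driver:

    # Minimal sketch (assumes Linux with the kernel DRM subsystem and sysfs mounted).
    # Lists the GPU's display connectors so you can see whether the port is
    # DVI-I (also carries an analog signal, so a passive VGA adapter works)
    # or DVI-D (digital-only, which needs an active converter).
    import glob
    import os

    for path in sorted(glob.glob("/sys/class/drm/card*-*")):
        name = os.path.basename(path)        # e.g. "card0-DVI-I-1"
        try:
            with open(os.path.join(path, "status")) as f:
                status = f.read().strip()    # "connected" or "disconnected"
        except OSError:
            continue                         # skip entries that aren't connectors
        print(f"{name}: {status}")

If the connector name reads DVI-I, the small passive adapters described above are all you need; no signal conversion is happening at all.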