Are lower-resolution monitors safer for gaming in the long run?

1

EDIT: My earlier question was too subjective, so I've rephrased it entirely to broaden its perspective.

In my opinion, PC games are always targeted to run best on the latest-gen graphics cards, unlike console games, which are built to run on the same hardware for years after it has been released.

Would choosing a 1920x1080 monitor over a 1600x900 monitor kill my chances of a better frame rate in games a couple of years down the line? Should I decide in favour of the lower-res option to be safe for gaming in the long run?

SNag

Posted 2013-05-17T16:41:09.340

Reputation: 773

Please at least explain why if you're going to downvote this question! – SNag – 2013-05-17T16:48:19.750

You realize you can render the game at a lower resolution, right? – SaintWacko – 2013-05-17T16:49:01.143

This question will probably be closed since it's rather subjective, but I would go with option 2. While every graphics card will get higher FPS at lower resolutions, my 560Ti is running 1080p on every game without any noticeable issues. I run with the graphics mostly maxed and don't have any problems. I would bet that your 660Ti will keep ahead of software for most of its usable life. – None – 2013-05-17T16:49:38.603

@SaintWacko: Yes I do, but it is common knowledge that non-native resolutions actually make the game look worse. I wouldn't want that. – SNag – 2013-05-17T16:51:50.660

This has less to do with monitor resolution, and more with graphics card lifespan due to resolution rendering. – fbueckert – 2013-05-17T17:01:20.417

@fbueckert: Assuming equal usage of the same graphics card in two identical setups that differ only in the monitor (one higher-res, the other lower-res), which setup would continue to perform better over time? I'm inclined to believe it is the latter, but I'm still confused, which is why I'm asking this question. – SNag – 2013-05-17T17:06:42.140

That has barely anything to do with monitors; your graphics card doesn't care which monitor is hooked up to it. All it cares about is the resolution it has to output at. 1600×900 = 1.44 million pixels; 1920×1080 ≈ 2.07 million pixels. The lower resolution will last longer because there's less computation required to render each frame. – fbueckert – 2013-05-17T17:12:31.287
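For a rough sense of the extra work involved, here is a minimal Python sketch of that pixel arithmetic. The resolutions come from the question; the helper pixel_count is purely illustrative, and the assumption that rendering cost scales with pixel count is only a rough proxy:

    # Rough comparison of per-frame pixel workload at the two resolutions
    # discussed above. Assumption: rendering cost scales roughly with the
    # number of pixels shaded, which is only an approximation.

    def pixel_count(width, height):
        """Total pixels the GPU must render per frame at this resolution."""
        return width * height

    low = pixel_count(1600, 900)    # 1,440,000 pixels
    high = pixel_count(1920, 1080)  # 2,073,600 pixels

    print(f"1600x900 : {low:,} pixels")
    print(f"1920x1080: {high:,} pixels")
    print(f"1920x1080 is ~{high / low:.0%} of the 1600x900 workload per frame")

So 1920x1080 asks the card to render roughly 44% more pixels per frame than 1600x900, all else being equal.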

Assuming you're rendering at the same resolution on monitors with different native resolutions, there is no noticeable difference in performance. For modern hardware, upscaling graphics is trivial and might only cost you 1 FPS (out of maybe 100-200). This, of course, depends on how far you are going to upscale; unless you decide to let the monitor handle the upscaling, in which case the GPU has nothing extra to do except render the frames. – Nolonar – 2013-05-17T17:13:43.120

If anything, this is a question for Super User. – Origami Robot – 2013-05-17T17:15:26.067

@OrigamiRobot: Thanks! Can I have it migrated? – SNag – 2013-05-17T17:24:44.637

Answers

0

I think "safety" is the wrong word here. You are looking for "long-term use."

The monitor isn't the issue here; the graphics card is. If you have the ability to get a higher-resolution monitor now, there is no reason not to. In the future, if you find your graphics card is not putting out the FPS you want for a game, you have a few options:

  1. Replace the graphics card with a better one
  2. Lower the resolution the graphics card is putting out (see the rough estimate sketched below)
  3. Reduce the in-game graphics options (textures, anti-aliasing, particles, etc.)

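As a rough illustration of option 2, here is a hedged Python sketch that estimates how much frame rate you might regain by dropping the render resolution. The helper estimated_fps and the 40 FPS starting point are hypothetical, and it assumes the game is entirely pixel-bound, so treat the result as an optimistic upper bound:

    # Rough upper-bound estimate of the FPS gained by lowering the render
    # resolution (option 2 above). Assumption: performance is entirely
    # pixel-bound, so FPS scales inversely with pixel count. Real games
    # are also CPU- and geometry-limited, so actual gains will be smaller.

    def estimated_fps(current_fps, current_res, new_res):
        """Scale FPS by the ratio of pixel counts between two resolutions."""
        current_pixels = current_res[0] * current_res[1]
        new_pixels = new_res[0] * new_res[1]
        return current_fps * current_pixels / new_pixels

    # Hypothetical example: a game running at 40 FPS at 1920x1080,
    # dropped to 1600x900.
    print(f"~{estimated_fps(40, (1920, 1080), (1600, 900)):.0f} FPS at 1600x900 (upper bound)")

In practice the gain is usually smaller than this linear estimate suggests, since not all of a frame's cost scales with resolution.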
Keltari

Posted 2013-05-17T16:41:09.340

Reputation: 57 019