Why does 1080p fit my HDTV through a VGA cable but appear oversized through an HDMI cable?


I have put together a new PC with an XFX GeForce GTX 260 graphics card and have it connected to my HDTV.

First, I used an old VGA cable with a DVI to VGA adapter and plugged it into my HDTV's VGA port. Running at 1920x1080, it fit the screen perfectly.

Now, to avoid running another cable across the room, I have connected it to my TV's HDMI port with a DVI to HDMI cable, and the 1920x1080 desktop is cropped at the edges of the screen.

I have "fixed" the cropping by using NVIDIA's "Adjust desktop size and position" tool, which created a screen resolution of 1814x1022 to fit the screen, but this is no longer the TV's native resolution and confuses some software (e.g. WoW).

Why does VGA work as expected, but HDMI is scaled up? Can it be avoided?

GraemeF


I have experienced this same issue using a GTX 275, DVI-HDMI Cable on my Samsung Series 4 LCD 1080p TV. – Nick Josevski – 2009-10-15T22:40:37.330

Same thing happens on my Samsung Series 6 with an ATI Radeon X1200. I have never heard of "overscan" before, but I'll definitely check it out. – Cᴏʀʏ – 2009-10-16T12:58:01.427

Ha, just did a bit o' research. The newer Samsung Series 7 HDTVs have an auto-compensation feature for overscan... great. – Cᴏʀʏ – 2009-10-16T13:06:52.463

Answers

9

What you're seeing is called overscan. TVs apply it because NTSC broadcasts sometimes did not fill the entire screen, so sets zoom the picture in slightly to hide the edges. You'll need to look through your TV's menus for a setting that reduces the overscan amount to 0%. You may even have to get into your TV's service menu to make this adjustment.
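To put numbers on it: the 1814x1022 desktop mentioned in the question corresponds to roughly 2.75% of a 1920x1080 signal being cropped off each edge, which is a typical overscan amount. A minimal sketch of that arithmetic (my own illustration, not from this answer; the 2.75% figure is an assumption chosen to match those numbers):

    # Rough sketch: how much of a 1920x1080 desktop stays visible when the
    # TV crops a given fraction off each edge (overscan). The 2.75% value
    # is an assumption picked to match the 1814x1022 reported in the question.
    def visible_area(width, height, overscan_per_edge):
        visible_w = round(width * (1 - 2 * overscan_per_edge))
        visible_h = round(height * (1 - 2 * overscan_per_edge))
        return visible_w, visible_h

    print(visible_area(1920, 1080, 0.0275))  # -> (1814, 1021), close to 1814x1022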

jasonh


But why would that be different with an HDMI connection vs. VGA connection? – GraemeF – 2009-10-16T08:32:23.667

This is just a guess. The manufacturers put the VGA port on the TV specifically for computers, which won't need the overscan adjustments. So they're auto-disabled on that input. But they don't expect most consumers to be connecting their computers over DVI/HDMI quite yet, so the overscan defaults are left in place. – Ryan Bolger – 2009-10-16T18:15:43.960

@Ryan: That's correct. – jasonh – 2009-10-19T22:31:42.653

2

You don't mention what TV it is, but you probably need to set the mode on the TV to 'Just Scan' (or something similar) under the picture menu.

JRT


1

We had blurry edges and found that VGA worked better than DVI/HDMI, but we had to use a better-insulated and shorter cable.

The NVIDIA cards are in SLI, so I don't know if that makes any difference. The TV is set to Just Scan, and the TV's menus weren't the problem; it was on the computer side. We found that the Samsung 52 was only being detected as a 46. NVIDIA has a scaling setting for this in the control panel, and using it solved everything.

one


1

Try this. You are dealing with two different things:

  • The source video is at a specific resolution.
  • The TV or target video is at a specific resolution.

If you are sending from a computer, try setting the video output to 1080p, and then set the resolution on the TV to 1080p. Matching the resolutions should give you the best results.
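One quick way to confirm what the PC side is actually outputting before touching the TV (a sketch of my own, assuming a Windows machine with the pywin32 package installed; it is not part of the original answer):

    # Sketch: query the current Windows display mode to verify the PC is
    # really sending 1920x1080. Requires pywin32 (an assumption; the answer
    # above does not name any specific tool).
    import win32api
    import win32con

    devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    print(f"Current mode: {devmode.PelsWidth}x{devmode.PelsHeight} "
          f"@ {devmode.DisplayFrequency} Hz")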

I have found that converting between HDMI and DVI can cause quality issues; something is lost in the conversion, and the image quality suffered. That was my experience with a converter: when I moved to a new computer with HDMI on the motherboard, the video quality improved dramatically. However, my old motherboard's integrated video only supported 720p.

Also try updating chipset or video card drivers.

Charles


1

I had the opposite issue with my Samsung and an ATI card: by default the picture was shown with black bars around the edge while running at the native 1920x1080, even with the TV set to 'Just Scan'.

I fixed this in the Catalyst Control Centre: there was an underscan option that, by default, was set so that TVs such as yours display the full picture. It scaled the image without changing the signal sent to the TV (i.e. it was still full HD at a standard resolution).

Not sure if nVidia has something similar though...

As for why it differs between VGA and HDMI: I'm guessing it's the same reason as with most standard displays. VGA is scaled based on the furthest active pixels in each direction (the 'auto adjust' feature on most monitors), whereas HDMI carries resolution information, so no trial-and-error scaling is needed. If the question were why it differs between DVI and HDMI, I'd be stumped :)
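For what it's worth, the resolution information a digital input advertises comes from the display's EDID block, and the preferred (native) mode can be read straight out of its first detailed timing descriptor. A minimal sketch (my own illustration, assuming a Linux-style path for the raw EDID dump; the connector name varies per machine):

    # Sketch: decode a display's preferred (native) resolution from a raw
    # 128-byte EDID block. The file path below is an assumption (Linux
    # exposes EDID under /sys); obtain the raw bytes however your OS allows.
    def preferred_mode(edid: bytes):
        dtd = edid[54:72]                        # first 18-byte detailed timing descriptor
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        return h_active, v_active

    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:  # hypothetical path
        print("Preferred mode: %dx%d" % preferred_mode(f.read()))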

Richard Mansfield
