Prevent screen and desktop blackout when attached display powered on?

1

I have a custom home theater PC setup with the TV (a Sharp LCD) connected as monitor 3 via HDMI to my Windows 7 PC. I use an ATI/AMD 6770 graphics card with support for 3 monitors... I have two touchscreen monitors connected (via DVI and a second HDMI port) to the same card. Windows is configured to span the desktop over all 3 displays.

When I power on the TV the display on both monitor 1 and monitor 2 goes black for a couple of seconds. We'll call this blackout 1. Then it comes back for a few seconds (presumably as the TV warms up), and then goes black again (blackout 2) for a few more seconds. Everything is fine after that.

I believe blackout 1 is caused by the hardware on the graphics card detecting a new signal and reconfiguring resources internally. I believe blackout 2 is caused by Windows detecting a new display as reported by the graphics drivers and reconfiguring the Windows desktop. I don't have evidence of this, but it makes logical sense.
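One quick way to test the second hypothesis is to watch whether the number of displays Windows reports actually changes when the TV is powered on. Below is a minimal sketch (my own, not from the thread) that polls `GetSystemMetrics(SM_CMONITORS)` once a second via Python's ctypes; it assumes Windows and falls back to a message elsewhere, and the one-second interval is arbitrary:

```python
import sys
import time

SM_CMONITORS = 80  # GetSystemMetrics index for the number of display monitors


def monitor_count():
    """Return the number of displays Windows currently sees."""
    import ctypes
    return ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)


def main():
    if not sys.platform.startswith("win"):
        print("monitor polling requires Windows")
        return
    last = monitor_count()
    print(f"monitors detected: {last}")
    # Poll once a second; power the TV on or off and watch for a change.
    while True:
        time.sleep(1)
        current = monitor_count()
        if current != last:
            print(f"monitor count changed: {last} -> {current}")
            last = current


if __name__ == "__main__":
    main()
```

If the count jumps when the TV warms up, that would support the idea that the second blackout is Windows reconfiguring the desktop rather than the graphics card alone.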

How can I eliminate one or both of these blackout periods?

The two touchscreen monitors are control interfaces for the HTPC. It is incredibly annoying to have the entire HTPC UI go black not once but twice whenever the TV is powered on. I am open to any suggestions, including using a separate graphics card for the TV or not spanning the desktop or even switching to NVidia hardware if that might make a difference.

I should clarify that I think both the graphics card and Windows are working as intended: these blackouts are probably fine in most environments. My particular scenario however really depends on no interruption in the video feed to the two touchscreens.

Gene Goykhman

Posted 2013-06-28T19:22:24.497

Reputation: 198

I have this problem too: my HTPC has a monitor and a TV connected to it. If I'm working on the HTPC and my girlfriend turns on the TV, my monitor blacks out because Windows detects that the TV is powered on. Terribly annoying. I will try your advice on using the DVI output of the video card for the TV; hopefully it works for me too. – Leo – 2017-08-06T14:34:15.723

You can't. Your TV needs time to warm up, so both blackouts are likely caused by the TV itself. – Ramhound – 2013-06-28T20:43:36.640

Answers

1

I was able to find an acceptable workaround. By plugging the HDMI cable from the TV into an HDMI-to-DVI adapter and then plugging the adapter into the PC's DVI port, both blackout periods are gone when the TV is turned on or off.

I found this thread helpful in steering me towards the difference in display detection mechanisms between HDMI/DisplayPort and DVI ports:

http://social.technet.microsoft.com/Forums/windows/en-US/8a9b5aa7-fe33-4e6d-b39b-8ac80a21fdc2/disable-monitor-off-detection-how

There were a number of other suggestions in that thread that I did not try, such as actually clipping pin 16 (the hot-plug detect pin) on the DVI adapter, but this didn't seem to be necessary for me. As long as I am using the DVI output from my graphics card, the issue appears to be resolved.

Gene Goykhman

Posted 2013-06-28T19:22:24.497

Reputation: 198

1

I agree this is working as intended. The graphics card and driver detect a new device - the TV - and have to read the TV's settings. Since all of these displays are handled by the same graphics card, they all black out during detection and configuration.

Unfortunately, I think the solution would be to move the TV to a different graphics card. That way, the two LCD touchscreens are not affected by the hardware detection of the TV. I can't say for certain this will work, but it would be an easy test.

Keltari

Posted 2013-06-28T19:22:24.497

Reputation: 57 019

0

I once had a driver problem that caused issues like this. This may be a long shot, but have you tried other drivers?

And if you already have the latest version, have you tried an older one? (I had to use older drivers to get my screens working without those delays.)

Carl Abrahamsson

Posted 2013-06-28T19:22:24.497

Reputation: 177