I have a desktop machine with an ATI VGA/DVI AGP card driving two monitors. Everything is peachy until I connect to this machine via a remote desktop session. After a remote desktop session, I return to my desk to find one monitor completely off, while the second monitor shows a black screen, though the cursor appears if I move the mouse onto it.
I'm usually successful in logging in blindly, but it is getting more difficult now that my company has instituted a disclaimer screen after you log in that requires you to click OK before continuing.
I'm not sure if it is related to the problem, but Windows 7 defaulted to treating the monitor on the DVI output (display #1) as the primary and the one on the VGA output (display #2) as the secondary. I changed that default and set display #2 as the primary. It seems as if Windows is still outputting the primary screen to display #2 (i.e. the VGA monitor), but it isn't powering on the VGA port.
It is probably a bug, given that I'm running the RC of Windows 7, but I'm just looking for ideas on workarounds.
I am using auto login to get around this issue. – Sahil Singh – 2017-01-30T06:06:45.360
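For anyone wanting to try the auto-login workaround mentioned above, it can be configured through the standard Winlogon registry values. A minimal sketch as a .reg file — the username and password values below are placeholders, and note that `DefaultPassword` is stored in plain text, so weigh that against your security requirements:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
; "1" enables automatic logon at boot
"AutoAdminLogon"="1"
; Placeholder account credentials - replace with your own
"DefaultUserName"="YOUR_USERNAME"
"DefaultPassword"="YOUR_PASSWORD"
```

Double-clicking the .reg file (or importing it with regedit) applies the values; the machine will then log straight into that account on the next boot, bypassing the blind-login problem.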
Darn, all I can find are posts that tell you how to change the background... – Ivo Flipse – 2009-08-13T13:50:55.080
I'm also seeing the same issue with Win 7 RC. I also notice that all apps I used over remote appear on screen 2 after I log in. My assumption is that when you use Remote Desktop, it picks the screen that most closely matches the resolution given when you connect, thus moving everything to your secondary screen. – Paxxi – 2009-08-13T13:51:49.323