I have a desktop computer that I remote into on a regular basis. Prior to last week, the host OS was Windows Server 2012 R2. Last week, I installed Windows 7 in its place, and I immediately noticed that the computer was more sluggish over Remote Desktop, especially when watching YouTube videos, which skip every half second or so for the entire video. Last night, I added an SSD and doubled the RAM in the computer (things I had been wanting to do anyway), and the performance still sucks.
The current stats of the RDP host:
- Windows 7 Ultimate
- 240GB Crucial M500 SSD
- 32GB DDR2 RAM
- NVIDIA GeForce GTX 560 Ti
- Intel Xeon X5365
The client computer was the same the entire time, but here are its stats:
- Windows 7 Ultimate
- 500GB 10K HDD
- 64GB DDR3 RAM
- NVIDIA Quadro K4000
- Intel Xeon E5-2687W
The internet connection on both sides exceeds 10 Mbps upload and download. (I realize latency is the main concern, but I don't know how to measure it properly, and regardless it was the same with Windows Server as it is with Windows 7.)
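If a plain ping is good enough as a rough latency check, this is what I could run from the client (HOST here is just a placeholder for the desktop's address; the summary at the end reports minimum, maximum, and average round-trip times over the 20 echo requests):

    ping -n 20 HOST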
I have tried changing the RDP performance settings on the client side from "Detect" to "Modem (56kbps)", but the "YouTube test" still fails: the video plays at an unusably low frame rate.
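In case it matters, my understanding is that the "Modem (56kbps)" profile corresponds roughly to the following entries in a saved .rdp file; this is a best-guess mapping on my part, not something I have verified against documentation:

    connection type:i:1
    disable wallpaper:i:1
    disable full window drag:i:1
    disable menu anims:i:1
    disable themes:i:1
    bitmapcachepersistenable:i:1
    compression:i:1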
As per the suggestion from another Super User question, I have tried both

    netsh interface tcp set global autotuninglevel=highlyrestricted

and

    netsh interface tcp set global autotuninglevel=disabled

restarting the computer each time and re-testing. I did not see any difference in performance.
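For what it's worth, the active value can be confirmed after each reboot with

    netsh interface tcp show global

which lists "Receive Window Auto-Tuning Level" among the global TCP parameters.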
My question: Why did the performance drop? My current theory is that the cause is the change in operating system. Is there anything else I can try to get the performance back to what it was?
That the server edition is more performant for RDP is an interesting suggestion, but I have never heard of it. Can you point to a source showing that the OS edition affects RDP performance? As I said, I have tried the RDP session with all visual styles disabled (the "Modem (56kbps)" setting), but this did not improve things. – Logical Fallacy – 2014-04-23T22:10:46.927