Can I force Windows RDP (Remote Desktop) to disregard the GPU?

2

I want to remote desktop into a Windows 10 host, but I want to force the RDP host service to consume only CPU resources and no GPU at all. Whenever I RDP into my host machine I can see GPU resources being consumed. I do not want that, as I reserve the GPUs in my machine for machine-learning workloads that use the GPUs.

Can I force Windows' RDP to not consume GPU resources?

Thanks

(Screenshot: Task Manager showing which processes consume GPU resources during the RDP session)

Matthias Wolf

Posted 2018-03-25T12:03:06.977

Reputation: 375

RDP uses its own video driver to render remote sessions. How did you confirm the GPU use is due to RDP? – I say Reinstate Monica – 2018-03-25T12:07:43.987

@TwistyImpersonator, I have 2 GPUs installed, and only the first one's resources are used when remoting into the host. When I connect via a terminal, no GPU resources are used. So my current RDP session definitely consumes GPU resources on the host machine. – Matthias Wolf – 2018-03-25T12:09:55.920

You should use Process Explorer to confirm the TermSrv (RDP session host service) is consuming GPU. Do this by opening the process's details then look on the GPU tab.

– I say Reinstate Monica – 2018-03-25T12:14:42.790
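As an alternative to Process Explorer, per-process GPU usage can be listed from PowerShell via the GPU Engine performance counters. This is a sketch only; the counter set is present on Windows 10 Fall Creators Update (1709) and later:

```shell
# List GPU engine instances with nonzero utilization (Windows 10 1709+).
# Instance names embed the owning process ID, e.g. "pid_1234_engtype_3D".
Get-Counter '\GPU Engine(*)\Utilization Percentage' |
    Select-Object -ExpandProperty CounterSamples |
    Where-Object { $_.CookedValue -gt 0 } |
    Sort-Object CookedValue -Descending |
    Format-Table InstanceName, CookedValue -AutoSize
```

This shows which processes (by PID in the instance name) are driving GPU load during an RDP session.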

@TwistyImpersonator, I added a screenshot to show which apps consume GPU resources. Those apps do not consume GPU resources when I connect via a terminal session only. – Matthias Wolf – 2018-03-25T12:25:05.723

that's not remote desktop, that's the desktop window manager. Perhaps the correct question to ask is why it's using GPU in the RDP session, though even then I'm not sure what's abnormal about that. – I say Reinstate Monica – 2018-03-25T13:44:48.320

That may be true, but clearly RDP uses GPU resources. When logging GPU usage, a significant uptick in GPU resource consumption is noticeable during each RDP session. – Matthias Wolf – 2018-03-26T16:20:02.123

Answers

7

RDP before Windows 10 had its own graphics driver that converted the rendered screen into network packets to send to the client, which used the CPU exclusively. Windows 8 was the first version to start using the GPU.

Since Windows 10 build 1511 and Windows Server 2016, RDP uses the AVC/H.264 codec in order to support screens larger than full HD. This codec uses the GPU, but only under certain conditions and only for full desktop sessions; otherwise it falls back to using the CPU as before.

Using AVC/H.264 is now the default, but you may disable it using the Group Policy Editor (gpedit.msc) by drilling down to:
Computer Configuration -> Administrative Templates -> Windows Components -> Remote Desktop Services -> Remote Desktop Session Host -> Remote Session Environment.

Set the following policies to Disabled to disable the use of the AVC/H.264 codec:

  • Configure H.264/AVC hardware encoding for Remote Desktop connections
  • Prioritize H.264/AVC 444 Graphics mode for Remote Desktop connections
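If gpedit.msc is unavailable (e.g., on Windows 10 Home), the same policies can be set directly in the registry. The value names below are assumptions inferred from Microsoft's Terminal Services policy (ADMX) templates; verify them on your system before relying on them:

```shell
:: Sketch only: presumed registry equivalents of the two Group Policy
:: settings above. Value names (AVCHardwareEncodePreferred and
:: AVC444ModePreferred) are assumptions -- verify against your ADMX files.
:: Run from an elevated command prompt.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" ^
    /v AVCHardwareEncodePreferred /t REG_DWORD /d 0 /f
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services" ^
    /v AVC444ModePreferred /t REG_DWORD /d 0 /f
```

A reboot, or at least signing out and starting a new RDP session, is typically needed for the change to take effect.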

In any case, non-full desktop sessions should not currently use the GPU (but this could change without notice).

References:

The last reference contains this text:

This policy setting lets you enable H.264/AVC hardware encoding support for Remote Desktop Connections. When you enable hardware encoding, if an error occurs, we will attempt to use software encoding. If you disable or do not configure this policy, we will always use software encoding.

harrymc

Posted 2018-03-25T12:03:06.977

Reputation: 306 093

Very detailed answer and pointed me in the right direction. Thanks a lot – Matthias Wolf – 2018-03-26T16:23:23.610

1

The Windows desktop, regardless of whether it is used via RDP or locally, always consumes some GPU resources on the primary GPU.
It is simply designed that way.
The only way I know of to dedicate both of your GPUs to your machine-learning project is to add a third GPU and make sure that one is the primary. This can be a very cheap basic GPU, or the built-in Intel HD Graphics if your CPU happens to have it.
You may have to fiddle with BIOS settings and/or the order of the GPUs in the PCIe slots to get the desired card order. When the Intel HD Graphics is used, it usually becomes the primary GPU automatically, but some motherboards won't enable the Intel GPU at all if another GPU is present.

Tonny

Posted 2018-03-25T12:03:06.977

Reputation: 19 919

Then how come Windows can exist perfectly fine in a headless state without a GPU? Also, RDP or terminal sessions work perfectly fine with an installed GPU but a non-functioning driver, and obviously without consuming GPU resources. Clearly a GPU is absolutely unnecessary, so my question was simply whether a user has the ability to turn off drawing on GPU resources. – Matthias Wolf – 2018-03-26T16:16:54.740

@MattWolf yes, Windows will run without any GPU at all. But if one is available, the window manager will use it, and to my knowledge there is no way around that. – Tonny – 2018-03-26T17:26:51.233

That does not seem to be accurate. When connecting via SSH, no GPU resources are utilized. Windows by default does not utilize GPU resources. It seems that as soon as UI features are requested, Windows may use a GPU if present. And that was precisely my question: whether usage of a GPU can be toggled on or off when using RDP. Apparently it cannot be turned off. But stating that Windows always uses GPU resources when available is factually incorrect. – Matthias Wolf – 2018-03-28T01:27:55.663