
I'm trying to find out if it's possible to run a Windows Server with one GPU which is shared between all RDP clients so that people could

  • create a session on the server
  • start some program with a UI which needs GPU acceleration
  • disconnect afterwards while the program stays running and gets full acceleration
  • later reconnect to the session

Maybe that's an unusual use case, because most things I can find about Windows Server and GPUs seem to be about virtualization, e.g. here, where it's even mentioned that

if your workload runs directly on physical Windows Server hosts, then you have no need for graphics virtualization; your apps and services already have access to the GPU capabilities and APIs natively supported in Windows Server

which might indicate that it is possible.

I've read about RemoteFX and GPU partitioning, e.g. here, but again this looks like it's only for virtualization, and I don't care how fast RDP updates the remote screens as long as the running programs get full acceleration.

Am I searching for the wrong things? Is this even possible?

If it's possible, how would it impact performance when the session is connected and when it's disconnected?

ridilculous
  • Please add details about that program's needs. "some program with a UI which needs GPU acceleration" is too vague. What are the exact requirements? And was it ever tested on an out-of-the-box "vanilla" server installation with onboard graphics via RDP? Most things just run. – Bernd Schwanenmeister Apr 07 '22 at 14:24
  • It's a WinUI application and uses some specific NVIDIA extensions, e.g. one for external texture storage. I could never test it on a server over RDP, but it works fine on a workstation over RDP. – ridilculous Apr 07 '22 at 15:03
  • @BerndSchwanenmeister I also plan to record the app window server-side, hoping I get the full FPS there without being throttled by RDP. – ridilculous Apr 08 '22 at 06:21

1 Answer


As it's a physical server, you need to instruct it to use its own GPU for the RDP clients that connect to it.

The setting is here:

Local Computer Policy\Computer Configuration\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Remote Session Environment

Then enable “Use the hardware default graphics adapter for all Remote Desktop Services sessions”.

A screenshot is attached below; sorry, my OS is in French, but it shows the location.
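If you'd rather script this than click through the Group Policy editor, the policy is commonly reported to map to the `bEnumerateHWBeforeSW` registry value under the Terminal Services policy key; verify that on your server before relying on it. A minimal sketch under that assumption (Windows only, run elevated):

```python
# Sketch: enable "Use the hardware default graphics adapter for all
# Remote Desktop Services sessions" via the registry instead of gpedit.
# Assumption: the policy maps to bEnumerateHWBeforeSW = 1 (verify on
# your server). Requires an elevated prompt on Windows.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 1 = prefer the hardware GPU over the software (WARP) adapter
    winreg.SetValueEx(key, "bEnumerateHWBeforeSW", 0,
                      winreg.REG_DWORD, 1)
```

A `gpupdate /force` (or a reboot) afterwards makes the policy take effect.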

Please note that the OS of the connecting users must also be Windows 10 or later.

The limit you are more likely to hit is GPU memory, assuming your application is not compute-intensive on the GPU. You would calculate how many users can run the application before the video RAM is depleted.
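To put a rough number on that, here is a back-of-the-envelope sketch. All the VRAM figures are hypothetical placeholders; measure your app's actual per-instance usage (e.g. in Task Manager's GPU column or `nvidia-smi`) before planning capacity.

```python
def max_sessions(total_vram_mb: int, reserved_mb: int, per_session_mb: int) -> int:
    """How many concurrent sessions fit before video RAM is depleted.

    total_vram_mb   - total memory on the card
    reserved_mb     - headroom kept for the OS, driver, and desktop
    per_session_mb  - measured VRAM use of one app instance
    """
    return (total_vram_mb - reserved_mb) // per_session_mb

# Hypothetical figures: a 24 GB card, 2 GB reserved for the system,
# ~1.5 GB measured per application instance.
print(max_sessions(24576, 2048, 1536))  # → 14
```

This is a ceiling, not a guarantee: GPU compute and encode/decode engines are also shared, so sessions can still contend well before VRAM runs out.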

[screenshot: the Group Policy setting, French-language UI]

yagmoth555
  • Do you know if the fact that this runs in a session and (at least as long as the session is connected) has to stream the screen contents to the client (significantly) affects the performance, compared to running the same app on a Windows desktop with the same GPU? – ridilculous Apr 06 '22 at 13:55
  • @ridilculous It does affect performance, but you will have to test it yourself to see whether it's a good plan for your application. Multiple factors can cause bad FPS/lag, e.g. if the worker is remote and the internet link is bad. – yagmoth555 Apr 06 '22 at 14:07
  • You should just try it. If it works on a workstation over RDP, it will most probably be the same on a server. – Bernd Schwanenmeister Apr 08 '22 at 09:24