I have a Windows Server 2012 R2 machine acting as a Remote Desktop Gateway for a number of RDSH servers inside the perimeter, with a firewall between it and the Internet that allows outside access to that gateway. Our users report that their RDS connections through the gateway occasionally break with errors indicating the RDG dropped them, while the RDG itself only logs "client disconnected" along with session statistics. Investigation has shown that the firewall is overloaded at times and drops packets from its input queue. Sadly, the firewall is an appliance that is not easily replaced (a replacement will take some six months, while the problem of course needs to be solved yesterday), so I have to make my RDG work on a network that is congested by default.
Are there any settings I can apply to the RDG so that it won't drop UDP connections for an extended period of time, or perhaps so that it doesn't advertise UDP transport at all? The goal is to make sure that TCP retransmission eventually pushes the packets through the firewall and the connection doesn't break.
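For context, the only TCP-only workaround I'm aware of so far is client-side: the "Turn Off UDP On Client" Group Policy setting, which (as far as I understand) corresponds to the registry value below. A minimal sketch of that, assuming my reading of the policy is correct, though I'd much prefer a gateway-side setting so I don't have to touch every client:

```
:: Client-side policy to disable UDP for the RDP client (my understanding
:: of what "Turn Off UDP On Client" sets); run elevated on each client.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows NT\Terminal Services\Client" ^
    /v fClientDisableUDP /t REG_DWORD /d 1 /f
```

Rolling this out to all external clients isn't practical in my case, which is why I'm looking for something I can do on the gateway itself.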