We recently began load testing our application and noticed that it ran out of file descriptors after about 24 hours.
We are running RHEL 5 on a Dell 1955:
CPU: 2 x Dual Core 2.66GHz 4MB 5150 / 1333FSB
RAM: 8GB
HDD: 2 x 160GB 2.5" SATA Hard Drives
I checked the file descriptor limit and it was set to 1024. Considering that our application could potentially have about 1000 incoming connections as well as 1000 outgoing connections, this seems quite low, not to mention any actual files that need to be opened.
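For reference, this is roughly how I checked the numbers (standard Linux interfaces; the output will obviously differ per box):

    # Per-process soft and hard limits for the current shell
    ulimit -Sn
    ulimit -Hn

    # System-wide ceiling on open file handles
    cat /proc/sys/fs/file-max

    # Handles currently allocated / free / max, system-wide
    cat /proc/sys/fs/file-nr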
My first thought was to just increase the ulimit -n parameter by a few orders of magnitude and then re-run the test, but I wanted to know about any potential ramifications of setting this variable too high.
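If it helps frame the question, the change I had in mind is something like the following, via /etc/security/limits.conf (the standard persistent per-user mechanism on RHEL 5); the user name and values here are just placeholders:

    # /etc/security/limits.conf -- raise per-process fd limits
    # for the account the app runs as ("appuser" is hypothetical)
    appuser  soft  nofile  8192
    appuser  hard  nofile  65536

    # Then, after logging back in as that user:
    ulimit -n    # should now report the new soft limit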
Are there any best practices for setting this value, other than figuring out how many file descriptors our software can theoretically open?