
I need to limit CPU, memory, and network bandwidth usage for a set of processes on a per-user basis. A user here is really just a logical grouping for several daemon processes, not a real human, so different users have similar (but not necessarily identical) sets of running processes.

Unfortunately, I'm not an experienced Linux user, so I have no idea how to do this. Could you point out possible ways to accomplish it?

Rorick
  • This is a very general question, and the best thing would be to study this link in detail: [Linux Limiting and Monitoring Users](http://www.linuxtopia.org/online_books/linux_administrators_security_guide/16_Linux_Limiting_and_Monitoring_Users.html) – Muddassir Nazir Jun 02 '15 at 06:16

5 Answers


Pluggable Authentication Modules (PAM) limits will let you apply many of these quota restrictions on a per-login basis; see http://www.kernel.org/pub/linux/libs/pam/Linux-PAM-html/sag-pam_limits.html and the Linux Administrator's Guide.
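The module behind this is pam_limits, which reads /etc/security/limits.conf (and, on newer systems, /etc/security/limits.d/) when a PAM session starts. A minimal sketch of enabling it, assuming a Debian-style PAM stack (the file name differs on other distributions):

```
# /etc/pam.d/common-session (Debian/Ubuntu; /etc/pam.d/system-auth on RHEL/CentOS)
# Enforce /etc/security/limits.conf for every session opened through PAM
session    required    pam_limits.so
```

One caveat: these limits only apply to processes started through a PAM session (login, ssh, su, cron and the like), so daemons launched directly by init may bypass them.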

Kyle Brandt

Big subject, to be honest; someone else will answer far better than me, but you could start with 'man setrlimit'.
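For reference, here is a minimal, self-contained C sketch of what that API looks like: a small launcher that caps CPU time and address space and then execs a daemon. The wrapper idea and the limit values are purely illustrative, not something from the answer.

```
/* limit_and_exec.c -- sketch: set resource limits, then exec a daemon.
 * Build: cc -o limit_and_exec limit_and_exec.c
 * Usage: ./limit_and_exec /path/to/daemon [args...]   (path is illustrative)
 */
#include <stdio.h>
#include <sys/resource.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <program> [args...]\n", argv[0]);
        return 1;
    }

    /* Cap CPU time at 600 seconds of actual CPU usage (not wall-clock time). */
    struct rlimit cpu = { .rlim_cur = 600, .rlim_max = 600 };
    if (setrlimit(RLIMIT_CPU, &cpu) != 0)
        perror("setrlimit(RLIMIT_CPU)");

    /* Cap the address space at 512 MiB; allocations beyond this will fail. */
    struct rlimit mem = { .rlim_cur = 512UL * 1024 * 1024,
                          .rlim_max = 512UL * 1024 * 1024 };
    if (setrlimit(RLIMIT_AS, &mem) != 0)
        perror("setrlimit(RLIMIT_AS)");

    /* Limits survive exec, so the daemon runs under them. */
    execvp(argv[1], &argv[1]);
    perror("execvp");
    return 1;
}
```

Since limits are inherited across exec and by child processes, wrapping each daemon's startup like this is one way to apply them without touching the daemons themselves. Note that most rlimits apply per process rather than being aggregated across a user's whole set of processes, and none of them cover network bandwidth.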

Chopper3

ulimit can accomplish much of this, albeit in a somewhat low-level way. You can use iptables for network limiting.
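iptables by itself mostly classifies, counts or drops packets rather than shaping bandwidth, so a common pattern is to combine it with tc: mark a user's outgoing packets with the owner match, then put the marked traffic into a rate-limited class. A rough sketch, where the UID, interface and rates are made-up values:

```
# Mark outgoing packets owned by UID 1001 (hypothetical) with firewall mark 10
iptables -t mangle -A OUTPUT -m owner --uid-owner 1001 -j MARK --set-mark 10

# HTB queueing discipline on eth0 (adjust the interface); unmarked traffic
# falls into the default class 1:30
tc qdisc add dev eth0 root handle 1: htb default 30
tc class add dev eth0 parent 1: classid 1:30 htb rate 100mbit
tc class add dev eth0 parent 1: classid 1:10 htb rate 1mbit ceil 1mbit

# Send packets carrying firewall mark 10 into the 1 mbit class
tc filter add dev eth0 parent 1: protocol ip handle 10 fw flowid 1:10
```

The owner match only works on locally generated traffic (the OUTPUT chain), so this shapes what the user's processes send; limiting what they receive needs a different approach, such as ingress policing or an ifb device.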

Alex J
  • Note that ulimit in this context is a shell builtin (at least for bash), so to get information on it use 'help ulimit', not 'man ulimit' – Kyle Brandt Jun 01 '09 at 12:11

Use virtualization. Then you can be very strict and they can be root of their own VMs if they want.

Allen

If you're running Red Hat or a clone like CentOS, you can edit /etc/security/limits.conf to limit resources per user or per group. On other distributions this config file may be located elsewhere.
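A hedged sketch of what such entries could look like; the group name is made up and the exact set of supported items varies, so check `man limits.conf` on your system:

```
# /etc/security/limits.conf
# "daemonpool" is a hypothetical group collecting one logical user's daemons
@daemonpool    hard    nproc    100       # at most 100 processes
@daemonpool    hard    as       524288    # address-space (memory) limit, in KB
@daemonpool    hard    cpu      60        # CPU time limit, in minutes
```

These entries are enforced by pam_limits when a session starts, and there is no item for network bandwidth, so that part still needs something like iptables/tc.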

zero_r