
I need to limit CPU, memory, and network bandwidth usage for a group of processes on a per-user basis. A "user" here is really just a logical grouping for several daemon processes, not an actual human, so different users have similar (but not necessarily identical) sets of running processes.

Unfortunately, I'm not even an experienced Linux user, so I have no idea how to achieve this. Could you point out possible ways to accomplish it?


5 Answers


Pluggable Authentication Modules (PAM) limits will let you apply many of these quota restrictions on a per-login basis: http://www.kernel.org/pub/linux/libs/pam/Linux-PAM-html/sag-pam_limits.html and the Linux Administrator's Guide
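As a sketch, pam_limits is typically enabled by a line like the following in the relevant /etc/pam.d service file (the exact file name varies by distribution); once active, it reads its limits from /etc/security/limits.conf:

```
# e.g. /etc/pam.d/common-session (Debian/Ubuntu) or /etc/pam.d/system-auth (Red Hat)
session    required    pam_limits.so
```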


Big subject, to be honest; someone else will answer far better than me, but you could start with 'man setrlimit'.


ulimit can accomplish much of this, albeit in a somewhat low-level way. You can use iptables for network limiting.

  • Note that ulimit in this context is a shell builtin (at least for bash), so to get information on it use 'help ulimit', not 'man ulimit'. Commented Jun 1, 2009 at 12:11

Use virtualization. Then you can be very strict, and they can be root on their own VMs if they want.

  • I'd be glad to, but this stuff is itself intended to be a kind of virtualization =)
    – Rorick
    Commented Jun 2, 2009 at 9:59

If you're running RedHat or a clone like CentOS, you can edit /etc/security/limits.conf to limit resources per user or per group. On other distributions, this config file may be located elsewhere.
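A sketch of what such entries can look like; the usernames, group name, and values below are purely illustrative:

```
# /etc/security/limits.conf — example entries
# <domain>   <type>  <item>   <value>
daemonuser1  hard    as       524288   # address-space cap, in KiB (512 MiB)
daemonuser1  hard    nproc    100      # max number of processes
@daemongrp   hard    cpu      60       # CPU time cap, in minutes
```

An @-prefixed domain applies the limit to every member of that group, which fits the "user as a logical grouping of daemons" setup described in the question.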
