
I am running an Apache web server on Debian, hosting 12 different websites. Two days ago I suffered an attack: a hacker uploaded a PHP shell through FTP to one of those 12 sites.

My first thought about this shell was "bah, it can only reach the www/ folder, it can't go back up." But here's the problem: with that shell the attacker could get all the way up to the / folder and look at any folder or document he wanted (the mail queue, users, all the websites' files...). He could navigate around my entire VPS reading every document and its contents, though without modifying anything.

I've been thinking about it these last few days and I suspect it's a www-data permissions issue or something similar, but I couldn't find a solution.

So how can I arrange things so that requests to site1.com (on my VPS) run as a user that can only access that site's directory?

In other words, if a hacker uploads a PHP shell again, I want him to be unable to see anything outside /var/www/site1.com/www/.

Thanks guys!

user61264

2 Answers


What you've asked for is a good idea but in practice can be very difficult to implement.

There's really no way to stop the attacker from seeing files that are accessible to the web server: because the attack comes in via the web server, you can't block that access without making the files completely inaccessible. You can protect other sensitive data on your system by making sure it can only be accessed by specific groups; for example, make sure that only the "mail" group can read the mail queue. This means (a) creating the necessary groups, (b) setting the necessary file/directory permissions, and (c) ensuring that any daemons run with the correct credentials.
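
A minimal sketch of steps (a) through (c); the paths shown are illustrative and will differ depending on your mail daemon and layout:

    # (a) make sure a dedicated group exists for the sensitive data
    #     (the "mail" group normally already exists on Debian)
    getent group mail >/dev/null || groupadd mail

    # (b) restrict the mail spool so only root and the "mail" group can read it;
    #     www-data is not in that group, so a PHP shell can no longer list it
    chgrp -R mail /var/spool/mail
    chmod -R o-rwx /var/spool/mail

    # apply the same idea to anything else the web server has no business reading
    chmod o-rwx /etc/ssl/private    # illustrative example

    # (c) check which user/group each daemon actually runs as
    ps -eo user,group,comm | sort -u

Note that none of this helps for files Apache itself must be able to read; for those you need one of the isolation approaches below.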

If you're looking for a more robust solution, you can use some sort of lightweight virtualization solution (e.g., Linux Containers, http://lxc.sourceforge.net/) to create virtual private servers for each site, but this is both more time and resource intensive.
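
As a rough sketch with the LXC command-line tools of that era (the container name is made up here, and newer LXC releases may use different commands or templates), a per-site container looks something like this:

    # create a minimal Debian container for one site
    lxc-create -n site1 -t debian

    # start it in the background and open a console to it
    lxc-start -n site1 -d
    lxc-console -n site1

    # inside the container, install Apache/PHP and deploy only that one site;
    # a PHP shell uploaded there can see nothing outside the container's filesystem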

You could also run each website under a separate user ID. This is a little tricky; the easiest approach is to run one instance of Apache per site, each on its own port and under its own user, and then use Apache's proxy module to delegate requests from your main server on port 80 (see the sketch below). Since this solution involves one Apache instance per site, it also has resource consequences. There are modules that let you accomplish this within a single Apache instance; see http://blog.andreaolivato.net/open-source/running-apache2-virtualhost-with-different-users.html for an example.
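
A minimal sketch of the front-end proxy piece, assuming a per-site Apache instance is already listening as its own user on 127.0.0.1:8081 (the file name, host name, and port are illustrative):

    # hypothetical vhost that forwards site1.com from port 80 to the per-site instance
    cat > /etc/apache2/sites-available/site1.com <<'EOF'
    <VirtualHost *:80>
        ServerName site1.com
        ProxyPreserveHost On
        ProxyPass        / http://127.0.0.1:8081/
        ProxyPassReverse / http://127.0.0.1:8081/
    </VirtualHost>
    EOF

    a2enmod proxy proxy_http
    a2ensite site1.com
    /etc/init.d/apache2 reload

The per-site instance behind port 8081 is simply a second Apache with its own configuration whose User and Group directives are set to that site's account.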

larsks
  • Thanks for that fast answer. So basically it's not a "critical problem"? I mean, a lot of web servers run like this? – user61264 Nov 23 '10 at 17:08
  • The critical problem is making sure that the web user does not have access to critical data on the system (and does not have the ability to write files in places where they will be executable by the web server -- this is typically how attackers exploit PHP). For hosting unrelated websites, many people use some sort of virtualization solution for increased isolation of the websites, but a lot of places do exactly what you're doing. – larsks Nov 23 '10 at 17:15
  • Do you know the names of some virtualization solutions? Those 12 websites are unrelated. – user61264 Nov 23 '10 at 17:30
  • And one more question: if instead of 12 sites I had only 1, how could I stop the www-data user from being able to read files under /? I just want it to see the files under /var/www/, nothing above. – user61264 Nov 23 '10 at 17:47

I would recommend running Apache in a chroot'ed environment with mod_security. That way, any compromise of the Apache daemon (or of the user Apache runs as) only exposes the chroot'ed tree, not the entire server.

Documentation here: http://www.modsecurity.org/documentation/apache-internal-chroot.html
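
The core of that setup is mod_security's SecChrootDir directive. A minimal sketch, assuming mod_security is already installed and loaded; the chroot path and config file location are illustrative, and the chroot tree must first be populated with everything Apache needs, per the documentation above:

    # create the directory that will become Apache's root
    mkdir -p /chroot/apache

    # tell mod_security to chroot Apache into it
    echo 'SecChrootDir /chroot/apache' >> /etc/apache2/conf.d/modsecurity.conf

    # a full restart (not just a reload) is needed for the chroot to take effect
    /etc/init.d/apache2 restart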

Sam Halicke