

Does suPHP make a single site server less secure?




Posted by BrianLayman, 02-14-2011, 08:20 PM
This is something I've been debating for a while, and I'm actually facing the issue for a client right now.

OK, so here is what suPHP does, for people who want to follow along and in case someone wants to correct some misunderstanding on my part: suPHP is an Apache module that makes PHP execute files as the user who owns them. Without it, PHP would normally run under the user assigned to Apache, or as "nobody" (the default). The general understanding is that this increases security because writable directories never need to be marked 777 (aka writable by everyone). Also, all files created by PHP with suPHP active are created under that same owner and group, eliminating the need for most users to ever deal with file permissions. This is wonderful for servers with lots of sites running on them. It means that a security hole in a PHP file on one site is not going to crawl out to all of the other directories on the server. It also means that if somehow PHP does create a file in a common area, no other site will be allowed to open it, because the owner is set from the original site with the security hole. That's why I run it on all my multi-domain servers.

Now here's my issue: because PHP runs as the owner of all the PHP files in that site, it can actually write to all of those files and directories. You don't have to make your uploads directory special, because PHP owns it all. So wouldn't that mean that a single site running under suPHP may actually be at greater risk than a site running straight mod_php? Under suPHP, setting a directory to 755 is effectively the same as setting it to 777 as far as PHP is concerned, due to the 7 in the owner's column. Yes, you could chmod 444 everything you don't want writable, but doesn't that defeat the purpose of suPHP while still incurring the cost of running separate PHP instances for each owner? There is a performance hit for running suPHP.

Is there a security issue outside of PHP that is addressed by suPHP that I am missing? Or am I accurate in stating that running suPHP on a single site/user/server configuration actually makes all 755/644 files and directories writable and less secure? I'd love to hear people's thoughts on this...
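A rough illustration of the permission semantics described above, assuming a made-up user "brian" and a cPanel-style layout:

    $ ls -ld /home/brian/public_html/uploads
    drwxr-xr-x 2 brian brian 4096 Feb 14 2011 uploads

    # mod_php / DSO: PHP runs as "nobody". With 755, only the owner (brian)
    # can write, so a PHP script cannot create files here unless the
    # directory is opened up to 777.

    # suPHP: PHP runs as "brian", the owner, so this same 755 directory is
    # already writable by any PHP script on the site, and so is every other
    # file and directory brian owns (the 644/755 files too).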

Posted by Squidix - SamBarrow, 02-14-2011, 08:44 PM
With badly written scripts, suPHP can actually be worse because they will be running as the owner's user. However, the damage will be limited to that user's account, and the files writable by that process (the 644/755 files) are writable ONLY by that user, versus writable by anyone (666/777). Running a chmod 444 wouldn't help anyway, because as the owner of the file, the user (and therefore PHP running as that user) can change the permissions to whatever they want. We use suPHP because it makes things a lot easier for the customer (no more chmodding), which is great especially for customers who don't know much about Linux permissions. That is the advantage of suPHP.

Posted by BrianLayman, 02-14-2011, 08:55 PM
OK it sounds like you and I are in agreement then. And good point about the 444. Thanks for the quick reply. I would still love to hear from others...

Posted by artemirk, 02-14-2011, 10:28 PM
Script with "www" user permission in some kind of situation can allow full access to server. And "hacker" will crack all users sites. suPHP script with user permission have access only home dir and "hacker" will crack only user sites. I writing about situation with many users at server. And we try to hardened other users, if one of them use badly written scripts. But if you have only one user at server, you think correctly

Posted by SPaReK, 02-15-2011, 12:02 PM
One way to think of it: do you want to explain to ClientB that their account was sending out spam because of an insecurity in ClientA's website? That is what a shared hosting server without suPHP can result in. Yes, using suPHP can make things worse for a single shared hosting client, because an insecurity in a script on that client's account means that someone could overwrite that client's entire website. But this is why keeping scripts up to date with the latest security patches and using safe, secure website scripts is so very important. A user who ignores a Joomla! or WordPress update should realize that this makes their website a target for exploitation.

Posted by brianoz, 02-15-2011, 01:39 PM
Your logic is flawless; I too think suphp is slightly less secure on a single-purpose server (and wouldn't be without it on a shared server).

Posted by srider, 02-15-2011, 01:47 PM
Interesting discussion. One of my servers has about 100 of my sites and about 20 clients. I have suPHP off because I use cross-account file access for things like an RSS aggregator where a single installation is used across multiple domains. I have the enviable situation, though, of providing hosting only by invitation, and I know most of my clients personally, so I'm fairly confident they will not intentionally run malicious code. To protect the domains I own from accidental security breaches on a client domain, I use php_basedir and allow only my sites to have exceptions to the rule. This is easy to do with WHM.
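Presumably this refers to PHP's open_basedir restriction (the directive is named correctly later in the thread). A minimal per-virtual-host sketch, assuming mod_php; the host name and paths are made up:

    <VirtualHost *:80>
        ServerName client.example.com
        DocumentRoot /home/client/public_html
        # Confine this vhost's PHP scripts to its own tree plus /tmp
        php_admin_value open_basedir "/home/client/public_html:/tmp"
    </VirtualHost>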

Posted by Lightwave, 02-15-2011, 05:13 PM
With suPHP, it generally makes sense to run:

    chmod -R o-rwx,g-rwx /home/www/directory

In other words, set every file and directory of the website to ?00 (no group or world access). That would cause a problem for a site where one account needs the ability to read another's files, but that's not too common. suPHP is mostly pointless anyway, because there are better solutions to achieve the same security it provides.

Posted by AvailHosting-Jeff, 02-16-2011, 12:14 AM
I wouldn't bother running suPHP on a single-site server. It is more dangerous in the sense that a badly written PHP script could be hijacked and would then be able to modify the data on your account. However, with proper backups in place and permissions kept to a minimum, there should be little reason for concern.

Posted by foobic, 02-16-2011, 02:46 AM
Agreed. The other security measure you can take on a non-suPHP system is to disable all script execution (PHP and cgi) in the directories with apache write access. So if an attacker somehow exploits your website and manages to upload a shell script, it won't run.
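A minimal sketch of what disabling script execution in an apache-writable directory might look like, assuming mod_php is in use; the path is illustrative:

    <Directory /home/user/public_html/uploads>
        # Turn the PHP engine off and block CGI for anything uploaded here
        php_admin_flag engine off
        Options -ExecCGI
        RemoveHandler .php .phtml .php3 .pl .cgi
    </Directory>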

Posted by larwilliams, 02-16-2011, 03:01 AM
As long as all files and folders have proper permissions, SuPHP will always be more secure than running PHP as DSO.

Posted by larwilliams, 02-16-2011, 03:05 AM
The fact that you used the wrong name to refer to open_basedir is concerning, and relying on it for security is even worse. There have been several PHP vulnerabilities in the past three years that allow a PHP script to completely bypass open_basedir restrictions and do whatever it wants.

Posted by BrianLayman, 02-16-2011, 03:07 AM
How so? What are we missing?

Posted by larwilliams, 02-16-2011, 03:24 AM
There are too many possible vectors to list when referring to non-suPHP usage and the resulting "world writable" permissions that would inevitably need to be set on some files/folders. If you are using Apache and prefork, I would use something better like mod_ruid2. It works by switching to the VirtualHost's user name when handling incoming requests (so PHP runs under the user name, same as suPHP). The main differences are performance (suPHP is slow) and that caching solutions like xCache work properly.
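A minimal per-virtual-host mod_ruid2 sketch; RMode and RUidGid are the module's directives, but the vhost names and paths here are made up:

    LoadModule ruid2_module modules/mod_ruid2.so

    <VirtualHost *:80>
        ServerName joe.example.com
        DocumentRoot /home/joe/public_html
        # Handle requests for this vhost as the account's own user/group
        RMode config
        RUidGid joe joe
    </VirtualHost>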

Posted by foobic, 02-16-2011, 04:33 AM
Perhaps you should explain exactly which user will be writing to those "world writable" directories. Bear in mind that the OP has only a single website running on the server.

Posted by larwilliams, 02-16-2011, 05:43 AM
Any service running as non-root that could be tricked into running arbitrary code and installing an exploit of some sort (shell, backdoor) into the vulnerable directory/directories. Granted, it would take some reasonably good luck/knowledge on the attacker's part, but it is possible. There are many services that run as non-root users, including SpamAssassin (on many systems as mailnull) and Apache itself (as nobody). "World writable", in Linux terms, means ANY user can write data to the file/folder in question; it does not necessarily need to be root. I know I'd personally prefer increased security for a small loss of speed, but to each their own.

Posted by foobic, 02-16-2011, 06:17 AM
To me the main point is that maintaining privilege separation between the user and apache is a major security advantage. Even if the website is exploited (which is surely the most likely initial attack vector) you can impose strict limits on what the attacker can do. When you run suexec you throw that advantage away, and for what? The remote possibility that SpamAssassin might get exploited and then need to write to a website? If, as I've already suggested, you also have script execution disabled in the apache-writable directories, then even that would get the attacker no further. And of course, there's nothing in the OP to suggest that mail or other services are even running on this server. If this is really a concern then a simple change to using group ownership will allow access to both the user and apache (and no other user). So again, where's the downside?
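A sketch of the group-ownership arrangement being suggested, assuming Apache runs as the "nobody" user/group and a made-up account "user"; only the upload directory is opened to the apache group, and nothing is world-writable:

    # Upload directory: owned by the account, group set to Apache's group,
    # group-writable but closed to everyone else
    chown user:nobody /home/user/public_html/uploads
    chmod 770 /home/user/public_html/uploads

    # Everything else stays owner-writable only, with no world write access
    chmod 755 /home/user/public_html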

Posted by larwilliams, 02-16-2011, 10:44 AM
SpamAssassin was JUST an example, one that I came up with because of the vulnerability in spamass-milter that has really only been patched in RPM/DEB packages. Instead, my post was based on two things that are likely: 1) that other services are running on the server, and 2) that folders would be set to 0777, as mentioned in the original post. Your statement is also partially false: all disabling the execute bit would do is stop some attacks where an interpreter is not part of the command. Example:

    perl /home/user/public_html/perlhackscript.pl

The above will still run. Unfortunately, there are many people out there who will set 777 permissions on any folder they think needs to be writable. Combined with the above, you would have a successful attack. Security is not just about protecting against possible vectors, it's also about trying to mitigate the damage from any mistakes made. Last edited by larwilliams; 02-16-2011 at 10:54 AM.

Posted by foobic, 02-16-2011, 11:31 AM
The question, fundamentally, is about suexec. Not running suexec does add an extra layer of safety and doesn't require 0777 permissions. But even with the dreaded 0777s, which is more likely: that some other service will be hacked, and that the hack requires write access to the web space, or that the website will be hacked through an insecure PHP script? I know which one I'd be betting on!

My statement had nothing to do with the execute bit. I suggest disabling script execution through apache in apache-writable directories, using something like the rule sketched below. The point being that many PHP hacks offer the attacker rather limited access, just enough to upload a file. But if the file he uploads is a PHP shell and you allow him to run it... Doing this under suexec is pointless because the attacker will be able to write anywhere the user can, including overwriting the user's own scripts.

Agreed. And one of the best ways to achieve that, it seems to me, is not to run PHP as the user.
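A minimal sketch of the kind of per-directory rule being described, using Apache 2.2-era syntax; the path and extension list are illustrative:

    # In the apache-writable directory, refuse to serve script files at all
    <Directory /home/user/public_html/uploads>
        <FilesMatch "\.(php|phtml|pl|py|cgi)$">
            Order allow,deny
            Deny from all
        </FilesMatch>
    </Directory>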

Posted by larwilliams, 02-16-2011, 11:42 AM
The OP asked about SuPHP, not SuEXEC. 2 different things. I guess we just have different viewpoints. While I don't agree with yours, I do respect it.

Posted by foobic, 02-16-2011, 11:59 AM
True. I should have used the term setuid - my bad. However, this is really what the thread is about: suPHP is a tool for executing PHP scripts with the permissions of their owners. Likewise.

Posted by soyo, 02-20-2011, 09:07 PM
This is an interesting read... Can I throw three scenarios out to make sure I have followed this right?

Sample environment: let's say a RedHat/cPanel VPS (Virtuozzo).

SCENARIO 1:
-----------
Usage: Shared, using the maximum number of cPanel users that will fit based on resources. Open to the public, anyone can sign up.
Run suPHP or not? YES
Why? suPHP reduces security for each individual account, but keeps security breaches from extending to other accounts on this VPS. Even though it may slow down the server ~20%, it is necessary to run it because at any point a malicious user could sign up and start creating problems. In addition, it is difficult to know what scripts are being uploaded, or whether individual users are even updating their scripts, so the chance of an exploit is higher.

SCENARIO 2:
-----------
Usage: Shared (let's say 10-15 cPanel users), semi-private (invite or client-only), so it is very unlikely that a malicious user is able to sign up.
Run suPHP or not? MAYBE
Why? suPHP reduces security for each individual account, but keeps security breaches from extending to other accounts on this VPS. It can slow down the server ~20% as well. But since there are limited accounts, and a central administrator is setting them up and managing them all (in this scenario, clients/users may have access, but are not going to install things without contacting the central administrator), it may be preferable to retain the faster speed and increased security for individual accounts in this environment, so long as the administrator ensures that all installed scripts and server components are updated regularly and installs the necessary modules (e.g. mod_security) to maximize security for this environment.

SCENARIO 3:
-----------
Usage: Single user. One single cPanel account, running one large web system.
Run suPHP or not? NO
Why? Because in this environment there is no concern about cross-account contamination, so increased speed as well as increased individual account security would be preferable, again assuming proper updates/setup are maintained as outlined in scenario 2.

Are these accurate assessments?? Thanks for any thoughts on this. Last edited by soyo; 02-20-2011 at 09:12 PM.

Posted by brianoz, 02-21-2011, 03:12 AM
Based on a quick read, you're correct. suPHP partitions users nicely; otherwise they can read each other's files, and the results are an ongoing, subtle nightmare for the server and for you as an admin.

As mentioned above, there are better (faster-performing) replacements for suPHP these days which you should investigate. suPHP forks a separate process to serve each new PHP page, which is terribly slow, although in practice not a huge problem on a server that is not overstuffed with accounts. For instance, FastCGI is one of several alternatives; from memory, it works by starting up a small pool of PHP servers per account, and incoming requests are then handed off to an existing process rather than starting a new one. I guess part of the point here is that if your server isn't too heavily loaded you won't notice the improvement from replacing suPHP in the short term. There's no doubt that as time goes on people will migrate away from the older suPHP approach, though.

Another part of the point is that suPHP isn't so much protecting from malicious users, as that's fairly rare in my experience. More commonly it protects against subverted accounts, where a PHP script/subsystem has been compromised and the hacker is seeking to extend the compromise to other accounts on the server. [Although we did once take over hosting for an organization that had been hacked via this method every time an election was held, so at least some malicious users do exist!] Last edited by brianoz; 02-21-2011 at 03:15 AM. Reason: add final para on compromised PHP scripts

Posted by funkywizard, 02-21-2011, 07:42 AM
I would agree that suPHP can expose further security vulnerabilities, insofar as a vulnerability that could not be exploited under DSO with proper permissions set could be exploitable under suPHP, because the PHP script can change permissions on all the files and write to whichever files it wants to. Theoretically, a fully secure PHP script will be equally secure under DSO or suPHP, but in practice suPHP will help keep buggy scripts from impacting other users, whereas DSO provides more desirable restrictions in general against buggy scripts maliciously editing things that are not 666/777.

Posted by soyo, 02-22-2011, 08:41 PM
Thanks for the comments. Coming from a Windows-based environment, I'm just starting to consider the two primary options discussed here, but also FastCGI, as mentioned previously... So, if FastCGI has the advantages of DSO (speed and the ability to use accelerators) and suPHP (cross-account security and easier permissions), then why aren't people just switching to FastCGI?? (or maybe they are and I'm just living under a rock?)

Posted by Hsunami, 02-22-2011, 09:02 PM
The way FastCGI works is that it keeps persistent processes running, waiting for traffic, while suPHP has to spawn a new process every time a request comes in. So for FastCGI you really have to tune it based on the amount of traffic your site is receiving; otherwise you end up with not enough processes, or too many idle processes wasting memory. Because you have to tweak it and it's not truly ready to go out of the box, I'd imagine that's the reason people have not opted to use it.
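A minimal sketch of the kind of tuning being described, using mod_fcgid directive names; the numbers are illustrative only and would need to be sized to the site's traffic and available memory:

    # Global mod_fcgid process limits (illustrative values only)
    # Hard cap across all vhosts
    FcgidMaxProcesses 50
    # Cap per wrapper/class (roughly, per account)
    FcgidMaxProcessesPerClass 5
    # Allow idle classes to drain back to zero
    FcgidMinProcessesPerClass 0
    # Kill idle PHP processes after two minutes
    FcgidIdleTimeout 120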

Posted by funkywizard, 02-22-2011, 09:12 PM
Because DSO is actually more secure in some ways: you can lock your PHP scripts out of write/execute permissions on your files. With FastCGI, PHP runs with the same permissions as the user that owns the PHP file, which can magnify the impact of any potential vulnerabilities in your PHP scripts. FastCGI is a bit more secure in a multi-user shared hosting environment, because a problem with one person's script won't affect other customers, but on a single-user server DSO is going to be more secure in general. Both DSO and FastCGI will perform similarly, with FastCGI using less memory.

Posted by brianoz, 02-23-2011, 04:14 AM
Yes, but DSO allows an account to steal passwords from other accounts, which creates serious problems - privilege escalation, hacked sites, etc etc. At least with suphp/fastcgi/etc since PHP runs as the user, it's locked away in their account and can't hack other accounts on the server ...

Posted by funkywizard, 02-23-2011, 04:39 AM
Yes, again: in a single-user environment DSO is clearly more secure. In a multi-user environment there are pros and cons to each option; neither is really going to give you all the results you would want.

Posted by tkeith, 02-25-2011, 12:02 AM
Personally, to get the best of both sides, I run mpm-itk (on Ubuntu the package is apache2-mpm-itk). This lets you set the user/group ID that Apache runs as per virtual host. So, for user joe, you can have a separate user joe-www that executes all of joe's scripts. This way, joe can't interfere with other users' data, since the joe-www user should only be able to access joe's www directory. However, even if the joe-www user is compromised (say, via an insecure PHP script), the attackers still won't have write access to all of joe's data.
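A minimal per-virtual-host sketch of the mpm-itk arrangement described above; AssignUserID is the module's directive, but the user names and paths are made up:

    <VirtualHost *:80>
        ServerName joe.example.com
        DocumentRoot /home/joe/public_html
        # Serve this vhost as joe-www, a user distinct from joe, so a
        # compromised script can read but not rewrite joe's files
        AssignUserID joe-www joe-www
    </VirtualHost>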

Posted by BrianLayman, 02-25-2011, 01:23 AM
Oh that's interesting! I'd not heard of this one before. It sounds really slick. I saw one article that said it wasn't good for mass vhosts.. but I can't hazard a guess as to what their definition of mass vhosts is...

Posted by brianoz, 02-25-2011, 07:49 AM
Well, it depends on what you'd call a "pro" and a "con". DSO isn't secure in a multi-user environment, simple as that; anyone running it in multi-user (untrusted) scenarios needs their head read. It's not a matter of pros and cons: anything that almost completely voids server security is an unmitigated disaster, and users would be horrified to know that the contents of their databases could be read at will by hackers on DSO servers. I'm not kidding, it's really that bad. suPHP is slow, which is only noticeable in high-load environments, but there are faster, newer methods available which give similar results; i.e. suPHP itself is effectively superseded, although the concept of running as the user isn't. (For a single user, both for speed and for security, DSO of course makes sense, but I'm not debating that.)

Posted by FastServ, 02-25-2011, 11:38 AM
SuPHP is unnecessary on a single-site server. Aside from the implication you mentioned, you also lose performance over other options which are far more ideal for single site, high performance servers.

Posted by TheJoker, 02-25-2011, 09:41 PM
Good read. Thank you for all the information. I have a VPS that hosts my WordPress and my wife's WordPress sites. Unfortunately the automatic update function doesn't work with DSO without supplying the FTP password each time, which is a pain. So I'll stick with suPHP.

Posted by QuantumNet, 03-14-2012, 08:37 PM
Sorry for bumping an old thread, but people use these topics as a resource and I had to debunk some bad information being spread in this thread.

Just because you read some article on the internet that convinced you that suPHP is more secure than DSO doesn't make you an expert... suPHP is more insecure than a proper set of file permissions, period. It is impossible for something that has full control over a set of files to be secure. It is security for dummies and allows those dummies to pass the buck on to the end user, "the customer". It is no different than safe_mode... I bet you went around telling people they were crazy for not using safe_mode before PHP yanked the plug on it. It was yanked for a reason, and suPHP and suexec should go the same way. I suggest you take some more time to research Unix file permissions and PHP security techniques before you go making such bold statements in public forums.

There are a number of technologies that can be combined to make a shared hosting server more secure than using suPHP or suexec:

1) the PHP exec_dir patch
2) proper file permissions
3) Suhosin
4) open_basedir
5) .htaccess files which restrict script execution in world-writable directories:

    Order allow,deny
    Deny from all

That is just the start... There is more that needs to be done to a system to secure it than just that, but then again we host shell accounts and give our users shell access on their web hosting accounts... What the heck do you think those types of companies do for security? Obviously having shell on a system is better than exploiting PHP... If you think suPHP is your answer then you have got a lot to learn about system security.

System security is not about preventing them from getting in, because they will always get in, whether through a system application such as Exim or a website's forum software such as IPB; it will always happen. So the solution is limiting what they can do once they are in... suPHP is not the answer; correct file permissions and a set of proper security technologies are. Shame on you for only thinking about your system and not your customers' business.

Your statement about an attacker reading every database on the system is BS... an attacker cannot read what they don't have permission to read. Get a book on Unix file permissions, then learn how to set your system up so they cannot read what they shouldn't even if they compromise another user's account... or hire an expert, but please don't spread B.S. like this around for others to adopt your bunk ideas. Last edited by QuantumNet; 03-14-2012 at 08:41 PM.
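A minimal php.ini-style sketch of the kind of restrictions items 3 and 4 above refer to, plus PHP's stock disable_functions; the values and paths are illustrative only, not a recommended list, and open_basedir is typically set per vhost rather than globally:

    ; Confine scripts to their own tree plus a temp directory
    open_basedir = /home/user/public_html:/tmp
    ; Disable shell-spawning functions
    disable_functions = exec,passthru,shell_exec,system,popen,proc_open

    ; If the Suhosin extension is loaded (directive names per the Suhosin docs)
    suhosin.executor.disable_eval = On
    suhosin.executor.func.blacklist = exec,passthru,shell_exec,system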

Posted by brianoz, 03-15-2012, 01:40 AM
There's a subtle point here: security isn't as simple as just one issue. The issue you're using to assess security in this instance is "can an application write to its own files?". Assessed in that light, suPHP of course fails dismally; a web app can overwrite its own files, there's no denying it. However, there's another light in which security can be viewed, and it is this: "can an application read other people's files on the server?". DSO allows other users on the server to read PHP files. Granted, safe_mode, Suhosin and open_basedir make that much harder, but they're not 100% and can never be 100% (and unlike yourself, most people who run DSO don't configure them anyway!). One of the precursors to having any security at all is not allowing people to read other people's files. If you think that Suhosin, open_basedir etc. will prevent that (which is understandable), you're unfortunately wrong. If I have to choose, and I hate having to choose, but I do, I'd rather an account be able to hack itself than other accounts on the server.

Good security is multi-layered. suPHP and family (including the superior, faster, more recent FastCGI and other implementations; suPHP is getting a bit old these days) is only *one layer* of a security solution. Anyone thinking one layer is sufficient to keep them safe by itself is like someone thinking their skin will keep them dry when it rains. However, this thread and my point were about a single topic, so that's all I discussed, but since you asked...

These all have their places, sure, and we use a lot of them on our servers, but fundamentally, if one user can read another user's files on the server there's something wrong; it just violates a basic principle of security design. In most cases, certainly on cPanel, running PHP as DSO runs the PHP interpreter inside the Apache server, which usually runs under the Unix "nobody" user. In effect this results in DSO running PHP as a single user, usually "nobody". As this user requires access to all files on the server, any one account has the ability to read every PHP file owned by every user on the server. open_basedir, safe_mode, whatever, may slow people down, but one way or another there's always a way around these for a smart hacker.

You're partly right here, and partly wrong. Good security **is** about keeping them out. It has to be. Once they're in, we've failed fundamentally; it's only a hop, skip and jump to getting root, if they know what they're doing. suPHP(*) is a part of keeping them out, and in my experience over many, many years of running Unix servers it is a pretty darn good idea, but only one component of a solution. AND, as well as doing your best to keep them out, it's also about making it hard once they do get in, which is where your concepts come in. Remember my multi-layered point above? Of course, people will get in sometimes, and let's pray they don't really know what they're doing. Making their life as hard as possible is very important; that's often enough to stop the script kiddies as well as many automated intrusion attempts.

There are many other concepts which are important. For example, one that's crucial in a suPHP/FastCGI environment is the followsymlink patch mentioned on the cPanel forums, which hopefully will be incorporated into Apache core at some point. (This patch doesn't apply to PHP as DSO, since DSO always has access to read the files anyway.) And there are many other layers/tasks to be done in securing a server: CSF, mod_security, cxs, FTP logging, good backups and backup rotation (see our site www.whmscripts.net), explicit checks for scripts such as timthumb.php, and the list goes on. In our case, the list of tasks to secure a server involves well over 100 steps/subsystems, and I'm sure we're similar to a lot of others out there.

I hope I'm now appearing a little less naive than that post may have made it seem! Apologies if the language in the initial article was inflammatory; I didn't intend to be rude. If I haven't been clear about anything please let me know so I can expand?

(*) Whenever I say "suphp" I include all the similar and newer alternatives, which are actually better-performing solutions, such as FastCGI etc.

PS: at the end of the day, there's simply no stopping a really bright and very determined hacker. One just has to help them not be interested (make it harder, and don't piss them off!). Last edited by brianoz; 03-15-2012 at 01:55 AM.

Posted by brianoz, 03-15-2012, 07:42 AM
This is completely correct; I was writing earlier from the perspective of multi-user shared hosting. On a single-user host, DSO is more secure, as the user can't hack/write their own files, and there aren't any other users to hack.

Posted by Website themes, 03-15-2012, 08:36 AM
Is PHP-FPM an alternative to suPHP? What other technologies act like suPHP (PHP running as the owner of the files)? We've already heard about PHP via FastCGI. Any others?
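PHP-FPM can indeed run each pool as a given account's user, which gives the same "PHP runs as the file owner" property. A minimal pool sketch, with made-up account name, socket path and sizing:

    ; /etc/php-fpm.d/joe.conf -- one pool per hosting account (illustrative)
    [joe]
    user = joe
    group = joe
    listen = /var/run/php-fpm/joe.sock
    pm = dynamic
    pm.max_children = 5
    pm.start_servers = 1
    pm.min_spare_servers = 1
    pm.max_spare_servers = 3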

Posted by SPaReK, 03-15-2012, 11:43 AM
I thought I would chime in here. I agree with what brianoz is saying. When you compare "vanilla" DSO PHP to "vanilla" suPHP, permission securities can be applied.

For example, take two clients on a web server, ClientA and ClientB. ClientB opts not to upgrade their Joomla! core script and someone exploits it. This can mean that the person exploiting ClientB's website can read a config.php file on ClientA's account. If that config.php file contains MySQL login information, then this exploiter could conceivably gain access to read ClientA's MySQL database and retrieve their administrative login information. (Hopefully this would be encrypted in some manner, giving a layered approach to security, but fundamentally just being able to read encrypted data on someone else's account is a security flaw.)

If "vanilla" DSO PHP is being used (I say vanilla meaning without any extra patches applied to the PHP core, comparing apples to apples), then the PHP scripts on ClientB's hacked Joomla! site are being executed as the same user, with the same level of permissions, that would be necessary for ClientA to have on their config.php file. Essentially, this means that in a "vanilla" DSO PHP scenario, the file permissions on ClientA's config.php have to be set to 444 or higher, or the file has to be owned by the Apache user with a 400 setting. But in either case, PHP scripts on both ClientA (who is supposed to have access to the config.php file) and ClientB (who isn't supposed to have access to it) have read access to the config.php file.

Now in a "vanilla" suPHP environment, ClientA can set the permissions on this config.php file to 400, with the file owned by ClientA. PHP scripts that run on ClientA's website run as ClientA's username and thus have read access to this config.php. But PHP scripts on ClientB's website run as ClientB's username, and they do not have read access to ClientA's config.php file. So even if ClientB's Joomla! gets exploited, that person won't be able to read ClientA's config.php file. That person will be able to wipe out ClientB's complete website, but whose fault is that?

The caveat here is that ClientA has to know that they are in a suPHP environment, and they have to understand file permissions and proper lockdown procedures for restricting access to confidential files such as a config.php with important information. Most do not know this, or don't understand it well enough to apply the correct permissions, and so they leave their config.php files set to the default file permission (usually 644), and then there's no discernible advantage to using a suPHP environment.

open_basedir, safe_mode and Suhosin may help in this regard in a DSO environment. But you can't compare a "vanilla" suPHP environment to a DSO environment with every security patch under the sun applied.
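A sketch of the permission difference described above, with made-up account names and paths:

    # DSO: config.php must be readable by the Apache user ("nobody"), so it
    # ends up world-readable and any account's PHP can read it
    chmod 644 /home/clienta/public_html/config.php

    # suPHP: PHP on ClientA's site runs as clienta, so the file can be
    # locked down to the owner alone; ClientB's scripts can't read it
    chown clienta:clienta /home/clienta/public_html/config.php
    chmod 400 /home/clienta/public_html/config.php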

Posted by Ramprage, 03-15-2012, 12:00 PM
To add to the multi-layer security notes, I didn't see any mention of mod_security yet in this thread. mod_security is often overlooked but extremely powerful in helping prevent exploits with a proper rules configuration. Heck, with proper rules it will even secure insecure old scripts for you. In any shared environment you need to work on the basis that no customer files or sites can be trusted, no matter the number of other users in the environment. 777 permissions should be avoided at all costs. If someone manages to breach your multiple layers of security and run a shell, you've just handed them access to all of your other customers' files that are 777.

My suggestion is, further, not to use shared hosting at all if you have a critical business website running. Grab yourself a managed dedicated server or VPS from a reputable vendor. There are way too many security risks in shared hosting, no matter the layers you add on, imo. For the average site or small website, shared is fine. But for websites that are your main source of revenue or critical to business operations, do not use shared hosting, period. Last edited by Ramprage; 03-15-2012 at 12:03 PM.

Posted by brianoz, 03-15-2012, 10:45 PM
Just in case people are searching on this as our friend QuantumNet suggests above: suPHP currently has a rather serious symlink bug, and you should apply the patch mentioned in this cPanel forum thread to avoid potential problems for your shared hosting clients: http://forums.cpanel.net/f185/how-pr...tml#post996441 Without the patch, symlinks can be used to view any file on a shared host. This patch was written by StevenC of Rack911.com. Last edited by brianoz; 03-15-2012 at 10:46 PM. Reason: add attribution...

Posted by hostingxchange, 03-15-2012, 11:00 PM
Perhaps I didn't read all the posts through as well as I could have, but there is one simple fact: If the site is to allow uploads, or files of any kind to be changed by the web server, then suPHP is always going to be more secure as world writable files or folders would not be required.

Posted by QuantumNet, 03-16-2012, 12:10 AM
No, not true... suPHP makes every directory effectively world-writable, just without the 777 or 770 bits... How do you think PHP creates and deletes files in them? Any of your customers' files can be deleted through PHP... an attacker can upload whatever they want and then execute whatever they want when using suPHP. If you block script execution in 777 and 770 folders, as I explained in my other post, then there are essentially no directories for them to upload exploit code to... well, they can still upload it, they just can't execute it. How are you going to block anything from being uploaded, modified or executed on a suPHP site? You can't.

Posted by QuantumNet, 03-16-2012, 12:28 AM
Let me explain why this isn't true on a properly configured DSO system, including disable_functions, exec_dir, and my method of dropping an .htaccess into 770 and 777 directories via cron (a rough sketch of such a job is shown after this post).

Let's say we have a server, and on this server there are 50 accounts running Invision Power Board. My shell script scans the server every 5 minutes and drops the .htaccess file into any 777 or 770 directories. Now, said attacker has a 0-day exploit for IPB that will allow him to upload a shell script, and he begins targeting all 50 sites (in the real world this doesn't happen because he doesn't yet know about all 50 sites, but let's just say he did, because if he did his window of opportunity is greater). Well, his script uploaded to all 50 sites, but they had the .htaccess file preventing execution, so he is unable to use his script due to the .htaccess rule.

OK, now let's say that, on the off chance, in this 5-minute window a user had made another directory 777 and his automated script happened to know which directory it was, maybe because it is a common directory. Well, the script could not execute due to the PHP disable_functions, so no shell code... but what if he was just using readfile to grab some configuration-file MySQL passwords? Well, in the best case he would have exactly 5 minutes (in the real world he would have less, as his attack would have to be perfectly aligned with the find command to get a full 5 minutes) to grab as many as he could, and he would have to know the exact location of them all. Remember, he cannot `ls /home` because we prevent that with file permissions, and he cannot `ls /home/username` because that folder is chmod 710, so he can only open files for which he knows the username and the exact path of the configuration.php file.

But let's step back into the real world: he doesn't know all the paths on the server, and he can't `find . -name` because of the 710. Even if he knew exactly where a 777 directory was, uploading to it would be useless because the timing would need to be exact, as in:

Web user sets chmod 777 on a new directory at 1:01:00pm -> attacker uploads to the directory at 1:01:03pm -> attacker executes the script, but the script has no idea what paths the configuration files on other accounts live at, so it defaults to grabbing the configuration.php for the current account -> 1:05:00pm, the script no longer works because that directory is now sealed by the .htaccess file.

On top of this, the attack doesn't work at all on sites that have been live for a while, because the .htaccess file never goes anywhere. So there is maybe a 1% chance that he could actually pull something off, if he just happened to hit at the same time the user set chmod 777 (you can't race if you don't know which folder was being set to 777).

Now let's do the same scenario with a suexec/suPHP-protected server. Again, we have a server with 50 accounts running Invision Power Board. The attacker sets up his freshly made 0-day to scan for IPB forums; it locates the 50 forums on our server, and his exploit code goes to town uploading said file into any directory he wants, because in all reality every folder on suexec/suPHP is effectively 777... without any complication at all he has successfully uploaded his exploit to all 50 sites, with a 100% chance of doing so, not a 1% chance. He now reads configuration.php on all 50 sites and then proceeds to dump their databases.

Hmm, odd: seems to me that from an attacker's point of view it is much easier to deface a boatload of websites running suexec/suPHP and dump their databases...
There is no security for websites in a suexec/suPHP setup; there is only security for the server. Oh, and yes, finding any site running IPB is easy thanks to Google: https://www.google.com/#hl=en&sclien...w=1920&bih=912
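A rough sketch of the kind of cron-driven job described above, run as root every 5 minutes; the .htaccess template path is made up and this is not the poster's actual script:

    #!/bin/sh
    # Drop a "deny script execution" .htaccess into any 777/770 directory
    # under the web roots that doesn't already have one.
    find /home/*/public_html -type d \( -perm 0777 -o -perm 0770 \) -exec sh -c '
        [ -e "$1/.htaccess" ] || cp /root/templates/deny-exec.htaccess "$1/.htaccess"
    ' _ {} \;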

Posted by QuantumNet, 03-16-2012, 12:32 AM
Anyway, we could go back and forth all day on which one is more secure; the truth of the matter is that neither is more secure than the other. mod_php on a properly configured system vs. suPHP: both can easily be exploited, it is just that the method of attack is different, and the mod_php method doesn't allow a site to get defaced or files to be modified with malicious code or deleted.

Is suPHP easier to set and forget? Absolutely; it is a quick and dirty fix, but it introduces its own set of vulnerabilities. Is mod_php able to be secure? Absolutely, but it requires more knowledge on the administrator's part to make sure the usual attack methods are plugged. But even when either is perfectly configured, both are still susceptible to SQL injection attacks due to the nature of web software. Truth be told, anything I can do to a suPHP system I can do to a mod_php system, just in different ways.

Posted by SPaReK, 03-16-2012, 11:45 AM
I will actually give you this one. In the scenario you paint, a zero-day exploit, an exploit that goes out into the wild before IPB is able to patch their script, using suPHP can be more insecure. But I would ask: what is more prevalent, 0-day exploits, or IPB website owners who never update their scripts? If you have any experience with web hosting and security, you know that it's people not keeping their website scripts up to date.

Website owners loathe updating their scripts; they are afraid something will break. And I won't dispute that fear either: a lot of times updating a script does seem to break the script's functionality on their website. But is that a valid excuse for leaving your website open to being exploited? Don't come crying to me when you don't update your Joomla! website and wake up 4 months later to find it hacked and deleted.

But this is also the reason why you should use reputable scripts for your website. While every script is vulnerable to a zero-day exploit, a major reputable script like IPB or vBulletin will fix that security hole a lot faster than "Joe's Website Forum" script. How important is your website to you? That should play a role in determining what scripts you put up on your website. Likewise, it is up to you, the website owner, to sign up for release mailing lists or RSS feeds, or to otherwise stay on top of the security of the scripts that you are using. If your website is important to you, you'll be informed as soon as IPB releases an update, and you will update your IPB forum as soon as you can.

Someone earlier mentioned using mod_security as a layer of security to keep outdated scripts safe. This is somewhat true; mod_security can provide a level of protection. But nothing compares to the security you'll gain from keeping your scripts up to date and using reputable scripts.

Now, in your scenario where you are using a crontab to place .htaccess restrictions on open directories on your server: what happens if the owner of that account deletes that .htaccess file? What happens if the user already has an .htaccess file in the directory with special instructions? You mention that it runs every 5 minutes; this may work fine for a 50-account server, but does it scale? What if there are 1000 accounts on the server? What if there are 50,000 open directories? Depending on the server and what it's doing, it's not going to instantly place this .htaccess restriction into directory entry number 50,000. It will have to go through each one, one at a time, 1... 2... 3... ..... all the way to 50,000. That will take some time.

The point I am trying to make: yes, suPHP makes YOUR website more vulnerable if you are not keeping it safe and secure. But who is responsible for keeping YOUR website safe? Yes, in a zero-day scenario, suPHP can make your website more vulnerable. But I would rather mitigate that by using reputable, well-programmed, secure scripts that I keep up to date than depend on a hodge-podge of .htaccess rules.

And anybody who comes across this thread: Ramprage is completely right. If the security of your website is of the utmost importance to you, then host it on its own dedicated server. Then you don't have to worry about any of this. You don't have to worry about the security of other web hosting accounts affecting your website. It's a cost vs. peace-of-mind thing. You might get a $5 hosting account, but if you are constantly worrying about the security of the server and the security of every other shared hosting account on that server, then it might be worth the $100 or $200 extra you'd have to pay for your own dedicated server. Only you, the owner of your website, can make that decision.

Posted by brianoz, 03-18-2012, 03:15 AM
The cron-.htaccess-777 trick is really nice; I hadn't thought of that one, great idea! However, I'm not so sure the total scenario actually plays out in real life the way you think it does, and I think that's valuable enough to share.

You're correct in saying the uploaded script won't run on all 50 sites. But what if the hack didn't require a writable directory? Or runs "php badscript.php" via a command loophole somehow? You're then open to attack. The attacker will run their script and go through /etc/passwd, enumerating home directories and looking for well-known config files. They will download the db usernames and db passwords from these files and keep them, using them to probe for admin (i.e. cPanel/Plesk etc.) logins as well as downloading the contents of databases. Any MD5 passwords will be downloaded and analyzed against dictionaries. In short, you have a server-owned scenario. The problem here is not that one account has been hacked, it's that all other accounts on the server can be hacked, because the hacker only has to find one vulnerability anywhere on the system to be able to read all the config files on every site on the server. Unix file permissions with DSO are such that any user can read all other users' config files.

Perhaps, on a vanilla suPHP site, but suPHP is best run with mod_security, which, if you run a good ruleset, will stop 90% of attacks, if not more.

This needs to be qualified. Yes, they can read the passwords on the sites they hacked, but only on those sites, not from the whole server. The difference here is "site owned" vs "server owned", which is a huge difference in terms of the effort required for remediation.

"Hmm, odd: seems to me that from an attacker's point of view it is much easier to deface a boatload of websites running suexec/suphp and dump their databases... there is no security for websites in a suexec/suphp setup, there is only security for the server."

As far as I can see, that is a generic search for "powered by" which doesn't show you IPB software running on a single server, so you'd need to work that out some other way (the uber-baddies have databases of just this data, is my guess). As you don't have access to work out the domains on the server, that's going to be harder, and unlike DSO you can't just go in and look for a standard set of config files across every account on the server, because Unix file permissions block your access. As an admin, I'd choose "site owned" every day over "server owned". That's just me. Granted, you're taking some smart measures to reduce the chance of script execution after upload, but it only takes a single vulnerability to expose your entire server.

Totally agree, we could go back and forth all day, and you make some good points. suPHP is not a quick and dirty fix for anything, in and of itself. It's like saying open_basedir is enough to protect a DSO server: it's just one part of a wise man's armory, just one layer in your protective quilt composed of many layers. Someone assuming suPHP by itself is sufficient to save them has a fundamental flaw in their security approach. I can't stress the importance of mod_security coupled with something like CSF enough. If CSF bans an attacking IP after a number of hits, it becomes much harder for attackers to probe a server. Sure, the smart ones switch to a new IP (the smart ones use botnets), but it still helps.

For what it's worth, I'd still argue that suPHP as part of a toolset is philosophically more secure than allowing PHP access to every file on the server. This has particular relevance on shared servers, of course. Thanks for making some good points, for educating us with some great ideas, and for contributing to a thoughtful discussion!


