Oh no! Need to beef up site before TV exposure!!!
Posted by tcharles72, 08-22-2007, 05:31 AM |
Hi all,
I've just found out that my website will be featured on a major TV show in just a couple of weeks! It has broad appeal, so I'm expecting usage to go through the roof.
I've got a hundred things to do before then, so I need to find an expert to take a look at my dedicated server - and recommend an upgrade path.
My questions...
1) Can anyone recommend a trusted systems architect who can tell me what setup and hardware I need (as well as tune Apache, MySQL and Postfix)?
2) How many concurrent users should a box like the below be able to support?
- CentOS 4.2 on a Dell PE850 P4 2.8GHz with 1GB RAM and 2x80GB disks (RAID 1)
- The network connection is called "10Mbps Flatfee (Volume)"
- The server runs Postfix, Courier-IMAP (5 clients), MySQL, PHP (dynamic site), Apache, a few backend perl daemons, Samba (3 clients) and Nagios. About a dozen users connect via SSH as well.
3) Obviously the easiest thing would be to upgrade the hardware. However, it probably makes sense to begin splitting services among one or more boxes for fault tolerance. Does that imply a dedicated MySQL server and two Apache servers?
I'd appreciate some advice and tips based on past experiences.
Thanks in advance.
|
Posted by LoganNZ, 08-22-2007, 06:36 AM |
Hi tcharles72
10Mbps isn't going to be enough for a large number of users. You want a burstable 100Mbps.
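To put rough numbers on that (assuming an average page weight of around 100KB, which is a guess - check your own logs): 10Mbps is about 1.25MB/s, so the pipe saturates at roughly 12 full page loads per second. A TV-driven spike can blow past that easily.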
Web server users: I honestly don't know, because I don't know how resource-intensive your site is. How much dynamic content is there? How many concurrent queries does MySQL have to handle at the moment?
Seriously, if I were going to get a heap of TV traffic, I would look at getting 2+ servers from a well-known dedicated host such as ThePlanet, FDCservers, or FastServers.net. They are fairly cheap, depending on what size servers you get. And depending on your site's dynamics (PHP/MySQL setup), they may be able to set up a simple load balancer that you can add more servers to as your load increases.
What is SSH being used for? As your TV hits increase, so do your security risks. Keep an eye on your SSH users' password security; brute forcing is still popular.
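If you can, go further than strong passwords. A minimal sketch of the relevant /etc/ssh/sshd_config directives (assuming your dozen SSH users can switch to key-based logins; the usernames are placeholders):

    # /etc/ssh/sshd_config - reduce brute-force exposure
    Protocol 2
    PermitRootLogin no
    # keys only - set up authorized_keys for everyone first
    PasswordAuthentication no
    AllowUsers alice bob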
Gimme a PM if you need any further help
|
Posted by FIAHOST, 08-22-2007, 06:54 AM |
That hardware won't make it. Early this year, a TV actor we host appeared on TV but didn't inform us about it beforehand. His dual Xeon server crashed within minutes. It was unfortunate for him that the show was talking about a website that was... down.
The next time he appeared on the same show, we took a more aggressive approach and everything was okay.
You really need to evaluate the exposure you are going to get. The whole problem is that you have to size your hosting system to support a peak lasting only 24 to 48 hours; after that, keeping the same servers would be overkill. So avoid entering into lengthy contracts.
|
Posted by ximi, 08-22-2007, 03:00 PM |
Go to your Apache settings and turn "Timeout" down to 10 seconds, or however low you're comfortable with. Make sure KeepAlive is on, but set the KeepAlive timeout to something low. On my web server I use 5 seconds, because that's slightly more than the typical gap between the last request of a page load and the next click, for users glancing through a few pages of your site.
This will help avoid those "Connection Refused" error messages during traffic spikes.
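A minimal sketch of what that might look like in httpd.conf (the values are just the ones suggested above; tune them to your own traffic):

    # trim idle connections quickly during a spike
    Timeout 10
    KeepAlive On
    MaxKeepAliveRequests 100
    # just over the typical think-time between clicks
    KeepAliveTimeout 5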
If there are any scripts you can disable from your pages, do it.
|
Posted by bin_asc, 08-22-2007, 08:25 PM |
You could try replacing Apache with LiteSpeed from litespeedtech.com. It works great and is used by top-notch sites like WordPress (you can look up their client list on the site). It integrates fully with Apache, so it can load your existing Apache configuration files and your sites will work directly. Also, do upgrade your NIC port; 10Mbps isn't that much. And consider adding more RAM, and possibly a hardware firewall, since you stated you'll be getting some quite heavy loads.
|
Posted by plumsauce, 08-22-2007, 08:43 PM |
One more possibility: switch over to static pages for the period in question, if the site structure permits it. You can also look at load balancing. Or get N more server accounts for a month and round-robin between them (see the sketch below). Or use a bunch of proxy accounts, as long as they don't crush the origin server.
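For the round-robin option, the DNS zone entries might look something like this (example.com and the addresses are placeholders; the low TTL lets you pull a dead box out quickly):

    ; most resolvers rotate through multiple A records for the same name
    www.example.com.    300    IN    A    192.0.2.10
    www.example.com.    300    IN    A    192.0.2.11
    www.example.com.    300    IN    A    192.0.2.12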
|
Posted by Steven, 08-22-2007, 11:20 PM |
It totally depends on the application. Primarily on these factors:
1.) How dynamic is the site?
2.) How intensive are the database requests?
3.) How willing are you to implement something like memcached in your code to offload database load?
The genre also has a lot to do with how busy a site will be. I have seen sites that worked fine on a shared account when they were mentioned on TV, and then again I have seen sites that needed a rapidly deployed set of servers to stay online.
How targeted is your site's audience going to be?
Honestly, if I had a site expecting an unknown amount of traffic, I would set up something like this.
The first thing is finding a provider and an administrator who know this stuff, because they will be able to tailor your setup to exactly what you need.
Next, the servers:
Hardware load balancer (round-robin DNS _can_ be used, but it's not as efficient)
- Low-end box
- Open-source software such as haproxy or LVS (see the sketch after this list)
Application Servers
- Beefy
- At least two
Image server (depending on the number of images, this can be combined with the application servers; if you have a lot of media such as images, movies, and sounds, I would split it out)
- Split images away from the application servers
- lighttpd or nginx for serving
Database Server
- Hardware depends on the database structure
- I would go with a RAID 10 setup
All of this connected via a 1Gbit backend network.
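To make the load balancer layer concrete, here's a minimal haproxy sketch (the IPs and server names are made up, and a real config needs health checks and timeouts tuned for your application; treat this as an assumption-laden starting point, not a drop-in config):

    # haproxy.cfg - round-robin two app servers on the backend network
    global
        maxconn 4096

    defaults
        mode http
        balance roundrobin
        # mark a server down if it stops answering simple GETs
        option httpchk GET /
        contimeout 5000
        clitimeout 30000
        srvtimeout 30000

    listen webfarm 0.0.0.0:80
        server app1 10.0.0.11:80 check
        server app2 10.0.0.12:80 check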
Again, this is in the event you receive a TON of traffic.
The one thing I recommend to clients who expect a lot of traffic is: spend the money and get more than you need, because you cannot predict the unknown.
Many people do not build out enough infrastructure, and that just adds to the headaches: rushing to keep the site online, lost revenue, etc.
You can always downsize later.
Last edited by Steven; 08-22-2007 at 11:23 PM.
|
Posted by derek.bodner, 08-22-2007, 11:47 PM |
Without knowing your application, it's going to be hard to give you really solid advice, but here are a few things to consider:
- Application Servers:
Get two servers, load balanced (as Steven said, that's most likely going to require a hardware load balancer). Either store the common data on a NAS, or rsync the data between the two; this gives you both failover and load balancing across those two servers. Lock these mofos down, because as someone said above, the more exposure they get, the more they're going to get attacked. One thing you can do is set up the two web servers with internal NICs only, put them behind a hardware firewall, and then require a VPN connection in order to SSH/FTP in. That doesn't guarantee security, but it will help protect against brute-force attacks.
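For the rsync route, a sketch of a crontab entry on the primary box (the path and hostname are placeholders, and it assumes SSH keys are already set up between the boxes):

    # push the docroot to the second app server every 5 minutes
    */5 * * * * rsync -az --delete /var/www/html/ web2.internal:/var/www/html/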
Separate MySQL server. For databases, the server's bottleneck will typically be disk access speed and CPU. So I'm going to tell you to get a server with a ton of RAM. Why? memcached:
http://www.danga.com/memcached/
RAM is cheap, and memcached, if implemented effectively, can really save your you-know-what.
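A minimal sketch of the cache-aside pattern in PHP, using the pecl memcache extension of that era (the key name, table, and 5-minute TTL are all illustrative, and a MySQL connection is assumed to already exist):

    <?php
    // check memcached first; fall back to MySQL only on a cache miss
    $memcache = new Memcache();
    $memcache->connect('127.0.0.1', 11211);

    $id   = (int) $_GET['id'];
    $key  = 'article_' . $id;
    $data = $memcache->get($key);

    if ($data === false) {
        // miss: hit the database, then cache the row for 5 minutes
        $result = mysql_query("SELECT * FROM articles WHERE id = $id");
        $data   = mysql_fetch_assoc($result);
        $memcache->set($key, $data, 0, 300);
    }
    ?>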
Once this is set up, make sure your provider is monitoring the servers. Be careful: some will set up ping monitors only and tell you they're monitoring your servers. But if MySQL crashes on your database box, what good is a monitor that shows the server answering pings while your site is down?
The rest really depends on your site and what it does. One thing to do is run some stress tests with ab (ApacheBench) during the overnight hours to see how you're holding up.
http://www.symfony-project.com/askeet/19
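A starting point for such a test might be (the URL and numbers are placeholders; ramp the concurrency up gradually and watch the server's load as you go):

    # 1000 requests total, 50 concurrent, against your heaviest page
    ab -n 1000 -c 50 http://www.example.com/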
|
Posted by macker, 08-23-2007, 05:06 AM |
Steven and Derek have both offered some very good suggestions.
RAM is your best answer; not only will it help today, it will help tomorrow. Dynamic database-driven PHP sites tend to hammer MySQL into the ground, and that is what usually brings the site down. Kick the RAM up to 4GB, and have someone tune your my.cnf, or search Google for "mysql tuning" and take a stab at it yourself.
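As a very rough sketch of the sort of my.cnf values people start from on a 4GB dedicated database box (these numbers are illustrative assumptions, not recommendations; the right values depend heavily on whether you use MyISAM or InnoDB and on your working set):

    # /etc/my.cnf - illustrative starting points, not recommendations
    [mysqld]
    key_buffer_size         = 256M   # MyISAM index cache
    innodb_buffer_pool_size = 1G     # main InnoDB data/index cache
    query_cache_size        = 64M
    table_cache             = 512
    max_connections         = 300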
If you're using an off-the-shelf CMS like Mambo or Joomla, then there's not much you can do on the application side.
Load balancing is a popular idea, but it is usually far more complex than people want to take on. If significant revenue is expected from this TV appearance, you may want to bring someone in, but give some realistic consideration to how much extra traffic your site is likely to see, for how long, and how much you stand to profit from it standing up to the full load.
I would NOT upgrade to a 100Mbit port unless you're willing to pay the cost of the bandwidth. Assuming it's billed at the 95th percentile, you'd have to see a lot of sustained traffic (assuming your server could even hold up) to get close to 100Mbit, but just the same, evaluate whether it's worthwhile. This really should be as much a business decision as anything else: are you in an impulse-buy scenario? Will people go to your site immediately after seeing it mentioned on TV? Will they come back later if they hit a temporary error? Etc.
If you were planning on upgrading anyway, now may be a good time, but I'd caution against going overboard. Be optimistic, but don't do anything that will have a significant adverse impact if it turns out to deliver less benefit than anticipated.
|
Posted by David, 08-23-2007, 08:50 AM |
Charles,
1. Steven @ rack911 is your definite answer. He's helped us with a lot of infrastructure development and management, and he's one of the best out there.
2. That's hard to gauge; it largely depends on the code you're using and how optimized it is. A good admin can definitely help optimize the infrastructure and daemons, though.
3. Mmm, depends.
|
Posted by Dennis Nugent, 08-23-2007, 11:49 AM |
"Obviously the easiest thing would be to upgrade the hardware"
Actually, it depends on how much content is on your site (JPEGs, video, etc.). If you have a lot of content, the easiest thing might be to outsource the big files to a CDN.
|
Posted by Steven, 08-23-2007, 12:13 PM |
We have found in multiple cases that building an infrastructure to host images yourself is cheaper in the long run than the costs of Akamai, CacheFly, etc.
|