
I am expecting high traffic on a corporate website I am managing. At present the website is hosted on GoDaddy shared hosting.

It will be an IPO for my client's company, so I don't have any idea what kind of traffic surge there will be.

How should I plan, and which hosting plan should I go for, at GoDaddy or at any other host?

Is cloud computing relevant for this situation? What would be the best/most cost-effective solution?

The site is a very small CMS in Classic ASP with an MS Access DB.

Also, please suggest any programming-related issues I should check so the site stays available flawlessly during high traffic.

regards, sunny

sunny
  • @Jim Zajkowski: Could you please kindly explain wget? I have the option of transferring the DB to SQL Server (at GoDaddy shared hosting); would this help? Also, web traffic would be mostly from Australia. – sunny Nov 03 '09 at 06:51
  • sunny - Jim is recommending you use wget (a Linux utility) to pull down a static copy of your website and replace your dynamic copy with that static copy. This only works if you can afford to lose your dynamic content. – epic9x Nov 03 '09 at 07:12
  • 2
    Wget is not just a Linux utility. It runs natively under Windows as well. – John Gardeniers Nov 03 '09 at 10:12
  • Dear friends, everyone is recommending static pages and the wget tool. I am running out of time and have never worked with wget. If I go for GoDaddy's dedicated hosting premium plan for Windows (http://www.godaddy.com/gdshop/hosting/dedicated-server.asp?ci=9014&display=dedicated), will this work for the IPO period? Considering the coding standard is average, with an MS Access DB, what would be the best option? Thanks again to all for the support. regards, Sunny – sunny Nov 05 '09 at 13:14
  • First and foremost, dump MS Access. It has limitations on concurrent connections and will fail even under moderate traffic. – mixdev Jan 29 '12 at 00:47

7 Answers


The key to surviving a massive influx of traffic is to increase the number of concurrent requests you can handle. That means either (a) decreasing the time it takes to render pages, so you can serve more visitors quickly, or (b) getting a hosting platform capable of handling more connections.

If you expect lots of media traffic, shared hosting is not for you. At the very least you should temporarily upgrade to a VPS or dedicated server - this is a critical time for your business (and you), and you don't want website or email trouble.

If you're short on time, I wouldn't recommend moving to something like the cloud - you're not going to be horizontally scaling much as far as I know (but I have almost no experience with that - I might be wrong). You'd also potentially have to go through changing DNS and changing hosts, which can be a traumatic experience depending on the support teams on both sides. See if GoDaddy can move you up to a dedicated server - this would give you dedicated CPU time and RAM and get you out of an environment where you're potentially going to be shut off for affecting other users. You might only be on this plan for a month or two - then you can decide whether moving back to shared hosting is right for you.

If you have time to move a copy of your site to a dedicated server before re-pointing the DNS, see if you can benchmark that copy before it goes live, to find out whether you need further optimization or whether throwing cash at it was enough. You can do this with something like Apache's ab if you have access to a Linux machine (or can grab a cheap Linux VPS) - a quick guide on this can be found here: http://www.cyberciti.biz/tips/howto-performance-benchmarks-a-web-server.html
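For example, a quick run against the staging copy might look like this (a minimal sketch; the hostname is a placeholder, and you should tune the request count and concurrency to whatever surge you're actually guessing at):

    # 1,000 requests total, 50 at a time, against the staging copy of the homepage
    ab -n 1000 -c 50 http://staging.example.com/

Watch the "Requests per second" and "Failed requests" lines in the output - if failures appear well below your expected concurrency, you need more tuning before go-live.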

As for other optimizations, SQL Server is probably faster than Access, and could probably be set up on your dedicated machine or a VPS. You'll want to get the site's developers involved and see if they can implement any caching or make any database optimizations, as those will lower the time it takes to render a page and move on to the next visitor.
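If the developers have no time for anything fancier, one cheap Classic ASP trick is to cache an expensive rendered fragment in the Application object so most requests never touch Access at all. A rough sketch, with BuildSlowFragment() standing in for your existing DB-driven rendering code:

    <%
    ' Rebuild the cached HTML at most once a minute; everyone else gets the copy.
    Dim html, stale
    stale = True
    If Not IsEmpty(Application("homeHtml")) Then
        If DateDiff("s", Application("homeHtmlTime"), Now()) < 60 Then stale = False
    End If
    If stale Then
        html = BuildSlowFragment()   ' placeholder for your real rendering code
        Application.Lock
        Application("homeHtml") = html
        Application("homeHtmlTime") = Now()
        Application.Unlock
    Else
        html = Application("homeHtml")
    End If
    Response.Write html
    %>

Even a 60-second cache means the database is hit once a minute instead of once per visitor.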

epic9x

I think you need to define high traffic/volume, and whether you expect the MS Access DB to be a shared resource. Does the site use SSL? Without more specifics this sounds like a recipe for failure; if anything, concurrent access and contention on that Access DB could be a serious bottleneck. If the DB is a local resource only, i.e. no shared user table or anything of that nature, then you may be able to parallelize the site across a cluster/cloud/whatever. Jim's recommendation above is a good step if this is true, although most Access-backed websites are anything but horizontally scalable.

MattyB
  • MattyB: 1. "whether you expect the ms access db to be a shared resource" - the MS Access DB only contains data for display on the website. I don't get the meaning of "shared resource". I just want to use the DB for the website only. No concurrent connections. I am sorry if I am not being specific to your query. – sunny Nov 05 '09 at 12:52
  • I am just trying to determine whether or not the Access DB in question is a dependency shared across instances of the website, i.e. monolithic, or conversely holds data that doesn't need to be shared across web servers. So if you had two web requests come in on two different web servers, would each request need to access the same DB versus its own local copy? In any event there are some very good answers in this thread dealing with web scalability. I would emphasize that it's usually an architecture/software problem, not your hardware, until you start serving massive volume. – MattyB Nov 05 '09 at 13:18

If you're expecting a surge in traffic with little warning, sign up with a CDN. Something like SimpleCDN (cheap) or MaxCDN (better, but a lot more expensive) can be enabled with just a credit card. You will need to make some DNS changes to handle this, and you will have to do some web server configuration to turn on caching for static assets (easy in IIS).
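If the server runs IIS 7, a minimal web.config sketch for caching static files might look like the following (the one-week max-age is an arbitrary choice; only use it for assets that won't change during the IPO window):

    <!-- web.config: send Cache-Control: max-age headers for static content (IIS 7+) -->
    <configuration>
      <system.webServer>
        <staticContent>
          <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
        </staticContent>
      </system.webServer>
    </configuration>

On IIS 6 the equivalent setting lives in the site's HTTP Headers tab ("Enable content expiration") rather than in web.config.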

Then you might want to use Response.Expires to add Cache-Control headers to all of your dynamic ASP pages, at least short-term, so the CDN can cache those too.
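In Classic ASP that's only a couple of lines at the top of each page; a sketch (five minutes is an arbitrary window - use whatever staleness the page can tolerate):

    <%
    ' Let the CDN and browsers reuse this dynamic page for a short window.
    Response.Expires = 5              ' minutes until the response is considered stale
    Response.CacheControl = "public"  ' allow shared caches (the CDN) to store it
    %>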

Finally, if you have time, dump MS Access as the database for your site and use the free SQL Server Express at the very least. There are upsizing wizards, so it should be fairly painless and require minimal code change.
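After upsizing, the code change can be as small as swapping the connection string; a sketch with placeholder path, server, and database names:

    <%
    ' Before (Jet/Access):
    '   conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\data\site.mdb"
    ' After (SQL Server Express on the same box):
    Dim conn
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=.\SQLEXPRESS;" & _
              "Initial Catalog=SiteCMS;Integrated Security=SSPI;"
    %>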

rmalayter
  • These are initial steps... I would add that after you get past this hurdle, you need to use performance counters to find out where your bottlenecks really are and then optimize from there. If you don't have access to those via your shared hosting account, it's time to move up in the world to a virtual server (or cloud server) at least. – rmalayter Jan 28 '10 at 13:53

I concur with the others that this sounds like a recipe for disaster. With an IPO you want your company's website to be up and available for media inquiries and investors, and to get your message out there.

It also sounds like you are quickly running out of time. The best solution in the long run is obviously to engineer this properly. But since you are short on time, allow me to give you some quick advice.

First, the MS Access backend is going to be the first failure point of your site. You need to make everything static, now. Use HTTrack to download your website's currently rendered pages. Then take out the parts it makes sense to take out (for example, something you are dynamically creating via your DB, such as stock quotes). Back up the files currently on your site and replace them with the "static" pages you downloaded.
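HTTrack has a command-line mode as well as the GUI; a sketch (the URL and output directory are placeholders):

    httrack "http://www.example.com/" -O ./mirror

Everything under ./mirror is then plain HTML you can upload in place of the dynamic pages.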

Second, a VPS may or may not be sufficient for your website. It totally depends on the traffic you expect, and in some cases remaining on your shared hosting might actually be better. Sign up for a dedicated server from a trusted provider who can guarantee setup time. You will pay more, but you have to ask yourself whether it is worth even several hundred dollars to have your website up and available to potential investors. I believe you will find that it is well worth it. I'm also sure you can find someone to migrate the site for you and get it done before the IPO. You will pay more, but again, it is worth it to have it done.

Dave Drager

I would recommend you look at generating the site with the maximum number of static files possible (wget --mirror if you have to), and consider putting it on an edge network, like Amazon's CloudFront.
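If you go the wget route, something along these lines (the URL is a placeholder) will pull down a browsable static copy:

    wget --mirror --page-requisites --convert-links --html-extension http://www.example.com/

--page-requisites grabs the images/CSS each page needs, --convert-links rewrites links so the copy works standalone, and --html-extension saves dynamic URLs as .html files any plain web server can serve.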

Those would be the simplest and cheapest things to do, and would give the most performance increase per person-hour.

Edit: I'm strongly suggesting getting away from using any database, period. If it's just a CMS, there's no reason not to generate static files and serve those, e.g. with Movable Type.

Jim Zajkowski

Much of this will come down to your code and use of resources. If you're expecting a high volume of traffic, your caching will be extremely important.

Scaling is usually NOT a hardware issue at first. Hardware is a band-aid that some people use in an attempt to fix poorly written software.

Make as much of your content static as you can, and put that on another host. Adding the complexity of coding for cloud computing will do less for you than having a clean, streamlined website.

Stack Overflow itself is a very good example: they had one web server until they broke 1 million page views per day. The site was very popular and survived for months on a single piece of hardware.

Look at your site's profile and any limitations you may have (like how fast you can serve web pages with your current database). Also look into tools like YSlow to see where your bottlenecks are on the client side.

sclarson

One thing you might wish to consider is a reverse-proxy/cache/application-management system such as Zeus's ZXTM appliance. It not only reverse-proxies/load-balances but is a really good cache, and most importantly it can either feed load levels back to your web servers and/or application servers (thus allowing them to modify their responses based on load) or can actually modify pages coming through it to reduce load. Specifically, it can change the page size/weight as load increases. The BBC website uses four to six of these appliances to manage its entire production external web system - when there's a big news event the load goes up and the page complexity comes down, meaning the vast majority of hits are read 100% from cache. It's a really cool system.

Chopper3