
I have a website whose number of visitors is increasing, and response times are getting poor. I am not a server specialist, so I want to describe my issue here to get some feedback. The website has the following characteristics:

  • Developed with ASP.NET Web Forms (I will be implementing a new version with ASP.NET MVC soon).
  • It uses a PostgreSQL 9.1 database.
  • It is 100% dynamic which means that every page is formed at run-time getting information from the DB.
  • Hosted in a cloud server with 2 Cores and 4 GB of RAM.
  • Windows Server 2012 in drive C:
  • PostgreSQL DB in another "hard disk" as drive D:
  • The average number of daily visitors is 2,500

Watching the Task Manager, I notice many PostgreSQL processes that together consume 100% CPU. The "IIS Worker Process" also shows up, but it does not look as critical as the PostgreSQL processes. So it seems the main issue has to do with database access. When CPU usage is high, the website responds extremely slowly and it is even difficult to establish a remote desktop connection.

I have thought about upgrading the cloud server to, for example, 4 cores and 8 GB of RAM, but first I want to get feedback from experts here so I can take the best approach rather than taking steps blindly.

Arun Vinoth - MVP
JORGE

1 Answer


It's hard to say what the exact problem is without some example queries, but I would first suggest having a DBA review your common queries to make sure you are not missing any indexes. Firing off a bunch of full table scans on a large dataset can cause significant performance issues.
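As a sketch (the table and column names here are hypothetical), you can run your most frequent queries through `EXPLAIN ANALYZE` in psql to see whether they fall back to sequential scans:

```sql
-- Hypothetical example: check how a common query is actually executed.
EXPLAIN ANALYZE
SELECT * FROM articles WHERE author_id = 42;

-- If the plan shows "Seq Scan on articles" over a large table,
-- an index on the filtered column usually helps:
CREATE INDEX idx_articles_author_id ON articles (author_id);
```

Re-running `EXPLAIN ANALYZE` afterwards should show an index scan and a much lower execution time if the index is being used.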

My second suggestion is to look into caching, specifically full-page caching. A reverse proxy such as Varnish (https://varnish-cache.org/) lets you cache the entire rendered page so you do not execute the same expensive queries over and over. Evaluate this case by case: for obvious reasons you should not expect to cache every page on your website, unless a page has no dynamic or user-specific content. Since you are on ASP.NET Web Forms, its built-in output caching is also worth evaluating before introducing a new component.
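Varnish itself is configured in VCL, but the core idea is simple enough to sketch in plain Python (all names below are hypothetical, not a real API): keep rendered HTML keyed by URL with a time-to-live, and only hit the database on a cache miss.

```python
import time

class PageCache:
    """Minimal full-page cache sketch: stores rendered HTML keyed by URL
    with a time-to-live, so repeated requests skip the database entirely."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (expires_at, html)

    def get(self, url):
        entry = self.store.get(url)
        if entry is None:
            return None
        expires_at, html = entry
        if time.time() >= expires_at:
            del self.store[url]  # stale entry: evict and force a re-render
            return None
        return html

    def put(self, url, html):
        self.store[url] = (time.time() + self.ttl, html)


def serve_page(url, cache, render_from_db):
    """Serve from cache when possible; otherwise render (hitting the DB) and cache."""
    html = cache.get(url)
    if html is None:
        html = render_from_db(url)  # the expensive, DB-backed render
        cache.put(url, html)
    return html
```

With a 60-second TTL, a page requested 100 times a minute triggers roughly one database-backed render instead of 100, which is exactly the pressure relief your PostgreSQL processes need.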

Lastly, you can consider configuring master/slave (streaming) replication. This allows you to create n read replicas to handle the increasing read workload you are experiencing. Make sure you do enough research on how to maintain replication, as it will add some overhead to your DB administration tasks.
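On PostgreSQL 9.1 the relevant settings look roughly like this (hostnames and values are illustrative, and you also need a `replication` entry in `pg_hba.conf` on the master):

```
# postgresql.conf on the master (PostgreSQL 9.1):
wal_level = hot_standby
max_wal_senders = 3
wal_keep_segments = 64

# recovery.conf on each replica:
standby_mode = 'on'
primary_conninfo = 'host=master.example.com port=5432 user=replicator'

# postgresql.conf on each replica, to allow read-only queries:
hot_standby = on
```

Your application then sends writes to the master and distributes read-only queries across the replicas.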

Simply increasing your database instance size will show diminishing returns. If there is a SQL performance issue, I suggest solving that first.

QuentinMoss