I have my setup like this:
Two separate instances of the database in a clustered environment. Each instance runs on its own dedicated server with a standard configuration.
Instance one (R1) is used by an application that performs day-to-day, general-purpose tasks for a certain client (this instance has read/write permissions).
Instance two (R2) is used by a BI (Business Intelligence) tool that generates large reports, performs data mining, and serves reference data (this instance has read-only permissions). This workload causes heavy server load and CPU/memory consumption (based on regular monitoring).
The server running the BI tool also has pretty decent specifications, and we allocated 12 GB of memory to it because we thought the bottleneck was there. Even with that 12 GB, R2 still seems to lack performance.
Our database has more than 100 tables, and the primary keys are UUID-based. So my question is: what are the best ways to optimize the database? Will indexing give a performance enhancement?
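For illustration, this is the sort of thing I have in mind (the table and column names below are made up, not from our schema):

```sql
-- Find tables that are mostly sequentially scanned (indexing candidates):
SELECT relname, seq_scan, idx_scan
FROM pg_stat_user_tables
ORDER BY seq_scan DESC
LIMIT 10;

-- Inspect the plan of a slow report query:
EXPLAIN ANALYZE
SELECT * FROM orders
WHERE customer_id = 'a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11';

-- If that shows a sequential scan on a large table,
-- would an index like this be the right fix?
CREATE INDEX idx_orders_customer_id ON orders (customer_id);
```

Is this the right approach for a read-only reporting instance, or is there something better suited to UUID keys?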
Or, more generally, what are the best tuning tips or tools out there?
PostgreSQL version: 9.1
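For context, I assume server-side tuning would involve `postgresql.conf` settings along these lines. The values below are illustrative guesses for a 12 GB machine, not our actual configuration:

```
# postgresql.conf -- illustrative values only, not a recommendation
shared_buffers = 2GB            # ~25% of RAM is a common starting point
effective_cache_size = 8GB      # hint to the planner about OS cache size
work_mem = 64MB                 # per-sort/hash memory; reports may need more
maintenance_work_mem = 512MB    # speeds up VACUUM and CREATE INDEX
random_page_cost = 2.0          # lower if the working set is mostly cached
```

Are these the right knobs to focus on for a read-heavy BI workload on 9.1?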