I'm running a web application (PHP + MariaDB) for my company. I develop on my laptop and run the production version on a dedicated server.
Recently, I started measuring performance and noticed that my laptop outperforms the server. For instance, one of the pages is generated in ~50 ms on my laptop vs. ~130 ms on the server. The time is measured inside the PHP code using microtime(), so network latency to the server is not included.
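Roughly, the measurement looks like this (simplified sketch; generate_page() is just a placeholder for the real work):

```php
<?php
// Simplified sketch of the measurement; generate_page() is a placeholder.
$start = microtime(true);

generate_page(); // routing, DB queries, template rendering, ...

$elapsedMs = (microtime(true) - $start) * 1000;
error_log(sprintf('Page generated in %.1f ms', $elapsedMs));
```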
The thing is, the server should beat my laptop on every single point. 130 ms is fine for this application, which is why I had never noticed this before, but if possible I'd like to understand what causes it.
I guess the next step would be to measure the time spent at different points of the code, but the difference is so large (more than 2.5×) that I can't help thinking I'm missing something.
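If it comes to that, I'd probably drop in a small checkpoint timer like the sketch below (class and label names are made up) and compare the per-section timings on both machines:

```php
<?php
// Minimal checkpoint timer (sketch) to compare sections on both machines.
class Checkpoints
{
    private $last;
    private $timings = [];

    public function __construct()
    {
        $this->last = microtime(true);
    }

    public function mark(string $label)
    {
        $now = microtime(true);
        $this->timings[$label] = ($now - $this->last) * 1000; // ms since last mark
        $this->last = $now;
    }

    public function report(): string
    {
        $lines = [];
        foreach ($this->timings as $label => $ms) {
            $lines[] = sprintf('%-20s %8.2f ms', $label, $ms);
        }
        return implode(PHP_EOL, $lines);
    }
}

// Usage (labels are illustrative):
// $cp = new Checkpoints();
// /* bootstrap */   $cp->mark('bootstrap');
// /* DB queries */  $cp->mark('queries');
// /* rendering */   $cp->mark('render');
// error_log($cp->report());
```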
Here are some details.
CPU
- Laptop: Intel(R) Core(TM) i5-5200U CPU @ 2.20GHz
- Server: Intel(R) Xeon(R) CPU E3-1270 v6 @ 3.80GHz
RAM
- Laptop: 16 GB DDR3 @ 1600MHz
- Server: 32 GB DDR4 @ 2400MHz
OS
Both run Debian 9, so they have the same versions of Apache 2, PHP, MariaDB, etc., and should be configured in roughly the same way.
MariaDB
On the server, the InnoDB buffer pool is set to 24 GB, split into 12 instances (all tables use InnoDB). It currently uses only about 4.5 GB, so disk access should not be an issue.
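To double-check that assumption, I can read the buffer pool counters directly; quick sketch (host and credentials are placeholders):

```php
<?php
// Sketch: check how often InnoDB actually had to read from disk.
$pdo = new PDO('mysql:host=localhost;charset=utf8mb4', 'user', 'password');

$rows = $pdo->query("SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_read%'")
            ->fetchAll(PDO::FETCH_KEY_PAIR);

$diskReads = (int) $rows['Innodb_buffer_pool_reads'];         // reads served from disk
$requests  = (int) $rows['Innodb_buffer_pool_read_requests']; // logical read requests

printf("Disk reads: %d of %d requests (%.4f%%)\n",
    $diskReads, $requests, $requests ? 100 * $diskReads / $requests : 0);
```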
What else?
- Swap usage is zero
- CPU is idle most of the time
- In particular, the measurements were made at times when no one else was using the application
- The server runs other services (e.g., git, other web applications) that shouldn't have that kind of impact on performance, especially given the low CPU usage (see the CPU-bound sketch below)
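To separate raw PHP/CPU speed from the database, I could also run a small CPU-bound loop on both machines via the CLI; just a sketch of what I have in mind (the bench.php file name is arbitrary):

```php
<?php
// Tiny CPU-bound benchmark (sketch): run `php bench.php` on both machines
// and compare the single-threaded timings.
$start = microtime(true);

$sum = 0.0;
for ($i = 1; $i <= 10000000; $i++) {
    $sum += sqrt($i); // arbitrary arithmetic work
}

printf("sum=%.2f in %.3f s\n", $sum, microtime(true) - $start);
```

If the server is also slower on this simple loop, the difference is not specific to my application or its configuration.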