I'm using several shared hosting services and have wondered many times: how do they calculate those "CPU seconds"?
For example, one of them limits me to 300,000 seconds/month, 10,000 seconds/24 h and 2,000 seconds/2 h. But what a "second" buys can depend significantly on the hoster's hardware and software (both my applications and the hoster's OS).
So I'm fairly sure that if I run some complicated and probably badly optimized SQL query that lasts 10 seconds, I will "spend" roughly 10 CPU seconds. No questions there.
But if I put a delay in a PHP script (<?php sleep(10); ?>), will it cost me the same 10 CPU seconds? Or if I download an external web page and it takes 3 seconds, will it be the same in that case?
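To check this myself, I suppose I could compare wall-clock time against the CPU time reported by getrusage(). Here is a minimal sketch, assuming a Linux host where getrusage() is available:

```php
<?php
// Compare wall-clock time with actual CPU time spent by this process.
// Assumes a Linux host; getrusage() is not meaningful on Windows builds.
function cpuSeconds(): float {
    $u = getrusage(); // resource usage of the current process
    return $u['ru_utime.tv_sec'] + $u['ru_utime.tv_usec'] / 1e6   // user CPU
         + $u['ru_stime.tv_sec'] + $u['ru_stime.tv_usec'] / 1e6;  // system CPU
}

$wall0 = microtime(true);
$cpu0  = cpuSeconds();

sleep(10); // the process is blocked in the kernel, not computing

printf("wall: %.3f s, cpu: %.3f s\n",
    microtime(true) - $wall0,
    cpuSeconds() - $cpu0);
// I would expect something like "wall: 10.000 s, cpu: 0.000 s":
// sleeping burns wall-clock time but almost no CPU time.
```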
Mainly I'm interested in the CPU consumption of PHP's file_get_contents().
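The same measurement could be wrapped around that call too (https://example.com/ is just a placeholder URL, and again a Linux host with getrusage() is assumed):

```php
<?php
// Measure CPU vs wall time around a blocking HTTP fetch.
$u0 = getrusage();
$t0 = microtime(true);

$html = file_get_contents('https://example.com/');

$u1  = getrusage();
$cpu = ($u1['ru_utime.tv_sec']  - $u0['ru_utime.tv_sec'])
     + ($u1['ru_utime.tv_usec'] - $u0['ru_utime.tv_usec']) / 1e6
     + ($u1['ru_stime.tv_sec']  - $u0['ru_stime.tv_sec'])
     + ($u1['ru_stime.tv_usec'] - $u0['ru_stime.tv_usec']) / 1e6;

printf("wall: %.3f s, cpu: %.3f s\n", microtime(true) - $t0, $cpu);
// My expectation under these assumptions: wall time is dominated by
// network latency, while CPU time covers only building the request
// and copying the response into memory.
```

Is that expectation correct, or do hosters count blocked time differently?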