(Be gentle with me, I'm a developer, not a server guy)
I'm investigating some issues with custom code of mine, and as part of that I'm tracking pool usage on a Windows Server 2008 machine. Two of the performance counters have me confused, though.
Memory\Pool Paged Bytes on the machine is 400 MB.
The description of that counter says:
Pool Paged Bytes is the size, in bytes, of the paged pool, an area of system memory (physical memory used by the operating system) for objects that can be written to disk when they are not being used. Memory\Pool Paged Bytes is calculated differently than Process\Pool Paged Bytes, so it might not equal Process(_Total)\Pool Paged Bytes. This counter displays the last observed value only; it is not an average.
Process(_Total)\Pool Paged Bytes is 9 MB. The description for that counter is the same as the one for Memory\Pool Paged Bytes.
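For reference, this is more or less how I'm sampling the two counters side by side. It's a minimal C sketch using the PDH API, not my exact code; PdhAddEnglishCounter assumes Vista/Server 2008 or later, and most error handling is omitted:

```c
/* poolwatch.c - sample both paged-pool counters in one query (link with pdh.lib) */
#include <windows.h>
#include <pdh.h>
#include <stdio.h>

int main(void)
{
    PDH_HQUERY query = NULL;
    PDH_HCOUNTER memPool = NULL, procPool = NULL;
    PDH_FMT_COUNTERVALUE value;

    if (PdhOpenQueryW(NULL, 0, &query) != ERROR_SUCCESS)
        return 1;

    /* System-wide paged pool size, as reported by the Memory object. */
    PdhAddEnglishCounterW(query, L"\\Memory\\Pool Paged Bytes", 0, &memPool);

    /* Sum of the per-process paged pool charges. */
    PdhAddEnglishCounterW(query, L"\\Process(_Total)\\Pool Paged Bytes", 0, &procPool);

    /* Both counters are instantaneous raw values, so one collection is enough
       (no sampling interval needed, unlike rate counters). */
    if (PdhCollectQueryData(query) == ERROR_SUCCESS) {
        if (PdhGetFormattedCounterValue(memPool, PDH_FMT_LARGE, NULL, &value) == ERROR_SUCCESS)
            printf("Memory\\Pool Paged Bytes:          %lld\n", value.largeValue);
        if (PdhGetFormattedCounterValue(procPool, PDH_FMT_LARGE, NULL, &value) == ERROR_SUCCESS)
            printf("Process(_Total)\\Pool Paged Bytes: %lld\n", value.largeValue);
    }

    PdhCloseQuery(query);
    return 0;
}
```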
I understand that these values are calculated "differently", but is a difference of this size indicative of any problem? And does anyone know how the two calculations actually differ?