I'm looking for good rules of thumb for deciding when NOT to virtualize a machine.
For example, I know that a machine running a fully CPU-bound process at near 100% utilization is probably a poor candidate for virtualization, but does it make sense to virtualize something that keeps the CPU at a "substantial" load (say 40 or 50%) most of the time?
Another example: even if 1000 machines are only lightly or moderately utilized, it would probably be a bad idea to run them all on a host with only 4 cores. A rough back-of-envelope calculation illustrating what I mean is sketched below.
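
To make that second example concrete, here's the kind of back-of-envelope math I have in mind (a rough sketch only; the utilization figures, guest counts, and the helper function are all hypothetical assumptions, not measurements):

```python
# Back-of-envelope CPU oversubscription check.
# All numbers below are hypothetical assumptions for illustration.

def oversubscription_ratio(num_guests, avg_utilization, vcpus_per_guest, host_cores):
    """Ratio of aggregate guest CPU demand to host CPU capacity.

    Values well above 1.0 suggest the host is oversubscribed.
    """
    demand = num_guests * vcpus_per_guest * avg_utilization  # core-equivalents needed
    return demand / host_cores

# 1000 lightly utilized guests (10% of one vCPU each) on a 4-core host:
print(oversubscription_ratio(1000, 0.10, 1, 4))  # 25.0 -> 25x oversubscribed

# 10 guests averaging 40% of one vCPU each on the same 4-core host:
print(oversubscription_ratio(10, 0.40, 1, 4))    # 1.0 -> right at capacity
```

Even "lightly utilized" guests add up quickly, which is exactly the kind of threshold I'd like rules of thumb for.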
Can someone summarize guidelines for virtualization based on machine workload, and on the sheer number of guest machines relative to host resources?
I typically virtualize on Windows hosts using VirtualBox or VMware, but I'm assuming this is a pretty generic question.