
I have an ETL process that I'm trying to deploy as a virtual machine across a few different "enterprisey" networks. I know it's relatively lightweight, but one client is very interested in the absolute minimal resources the VM will need allocated to do its job.

Other than tweaking the VM parameters until it fails to run, is there a way to measure what my CPU and memory requirements are? Can I just profile the VM process from the host machine and use those as rough estimates?

VMware and VirtualBox are the initial targets.

  • Typically I consider an application really lightweight if it still runs reasonably well with the minimum resources the OS itself requires. http://serverfault.com/questions/384686/can-you-help-me-with-my-capacity-planning – HBruijn Jul 23 '14 at 13:20

1 Answer


You can profile the VM over time on a real system to see what the host-side resource usage looks like. You'd then plan for the lowest-common-denominator configuration between the two hypervisors if you're only interested in distributing one appliance.
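As a rough sketch of that host-side profiling, you can sample the hypervisor process with `ps` on a Linux host and record its CPU usage and peak resident memory. The process name `VBoxHeadless` is an assumption for a headless VirtualBox VM (for VMware Workstation it would typically be `vmware-vmx`); adjust for your setup:

```shell
#!/bin/sh
# Sketch: sample host-side CPU and resident memory of a VM process.
# Assumes a Linux host with procps ps; "VBoxHeadless" is the example
# process name for a headless VirtualBox VM (hypothetical for your setup).
PID=$(pgrep -o VBoxHeadless)   # oldest matching PID
PEAK_RSS=0
while kill -0 "$PID" 2>/dev/null; do
    RSS=$(ps -o rss= -p "$PID")    # resident set size, in KiB
    CPU=$(ps -o %cpu= -p "$PID")   # CPU usage as a percentage
    [ "$RSS" -gt "$PEAK_RSS" ] 2>/dev/null && PEAK_RSS=$RSS
    echo "$(date +%T) rss=${RSS}KiB cpu=${CPU}%"
    sleep 5
done
echo "peak rss: ${PEAK_RSS} KiB"
```

Run this for the duration of a representative ETL job; the peak RSS plus some headroom gives a first estimate of the memory allocation, and the sampled CPU figures hint at how many vCPUs it actually keeps busy. Keep in mind the host-side numbers include hypervisor overhead, so they overstate what the guest needs slightly.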

But keep in mind that VMware and VirtualBox are completely different virtualization solutions, and don't consume resources in the same manner.

When you mention "VMware", are you referring to VMware vSphere and ESXi or something like VMware Workstation?

ewwhite
  • All of the above? I'm using Packer to build an image that I hope can be imported into any of those systems (http://www.packer.io/intro/platforms.html). I don't have vSphere to play with yet. – Steve Jackson Jul 23 '14 at 13:30