
I've moved from a job where developers were encouraged to use technologies like Docker and Vagrant to create VMs on their workstations for testing and development.

At my new job, the IT manager insists that allowing a user to create a VM will compromise the security of the entire company. I know of many companies (like Puppet Labs and Opscode) that allow Vagrant and Docker internally.

In what ways would allowing users to create VMs increase risk?
Is there a way to allow VMs that minimizes risk? (Perhaps using approved images?)

Update

Primarily focused on the risk of allowing full virtualization on workstations (VirtualBox, Vagrant, Hyper-V, KVM)

spuder
    Perhaps you need to differentiate between "Virtual Machines" and "Virtual Containers" - Virtual Machines allow you to install OSes, Virtual Containers allow you to augment the existing OS. Suddenly dropping a Windows OS into an all-Linux environment could drastically change the security of the network. – schroeder Apr 17 '15 at 20:36

1 Answer


It all comes down to how well the VM or container is segmented from your host operating system's network. If a guest virtual machine runs a vulnerable operating system, or a container hosts an application that is not properly maintained (i.e., patched), and either of the two connects to your corporate network via a bridged or shared network, then you run the risk of an attack penetrating the internal network. If proper steps are taken to segment the VM or container network from the host network (e.g., host-only or custom NAT networks), then there shouldn't be any issue with having users create their own VMs. This assumes you have a way to enforce these settings automatically, because I wouldn't trust a process or a human to stick to the proper network configuration.
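As a concrete sketch of what "enforcing the settings" might look like with Vagrant specifically, a team could distribute a baseline Vagrantfile that only defines a host-only (`private_network`) interface and never a bridged one. The box name and IP below are illustrative assumptions, not anything from the question:

```ruby
# Sketch of a baseline Vagrantfile that keeps the guest off the corporate LAN.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/focal64"  # assumed/approved base box, for illustration

  # Host-only network: the guest is reachable only from the developer's
  # workstation, so a compromised VM cannot talk to the corporate network.
  config.vm.network "private_network", ip: "192.168.56.10"

  # Deliberately NOT used: a bridged network, which would place the VM on
  # the same LAN as the rest of the company.
  # config.vm.network "public_network"   # <-- the risky setting to avoid
end
```

Combined with approved base images (as the question suggests), this kind of template reduces the chance of a vulnerable guest ever being bridged onto the internal network, though it is policy enforcement rather than a hard technical control.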

dandaman12