Context: an AI competition. Users upload a DLL containing a class that implements a provided interface; the competition runner then instantiates that class and invokes various methods on the interface.
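For concreteness, the runner does something along these lines (a simplified sketch; `ICompetitor`, `MakeMove`, and the loading details are illustrative stand-ins, not the real contract):

```csharp
using System;
using System.Linq;
using System.Reflection;

// Illustrative stand-in for the provided interface; the real contract differs.
public interface ICompetitor
{
    string MakeMove(string gameState);
}

public static class Runner
{
    public static void RunEntry(string dllPath)
    {
        // Load the uploaded DLL and find the first concrete type
        // that implements the competition interface.
        Assembly entry = Assembly.LoadFrom(dllPath);
        Type botType = entry.GetTypes()
            .First(t => typeof(ICompetitor).IsAssignableFrom(t) && !t.IsAbstract);

        // Instantiate it and invoke interface methods as the game progresses.
        var bot = (ICompetitor)Activator.CreateInstance(botType);
        Console.WriteLine(bot.MakeMove("initial-state"));
    }
}
```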
Obviously, this means that I'm intentionally allowing users to run whatever code they like as long as they wrap it up in an entry point that looks like my interface.
At the moment this is an internal company thing, so we 100% trust all employees (no discussion of this, please), but it would be lovely to be able to make it more public.
Is it possible to make a system like this safe? How thoroughly can I sandbox the DLLs? It's currently running in Azure. I assume the box is backed up by some automated Azure process? (I didn't set up the Azure box.)
A) Can I prevent them from interacting with anything else on the box?
B) Can I prevent them from accessing anything outside the Azure box?
C) Can I prevent them from accessing the Azure backups? (i.e. if I say I don't care whether they trash the box, because I could restore it from a backup, is that safe?)
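To make part (A) concrete: is a partial-trust AppDomain the sort of mechanism that could work here, or is OS/VM-level isolation the only real option? A minimal sketch of what I mean on .NET Framework (the names and permission choices are illustrative):

```csharp
using System;
using System.Security;
using System.Security.Permissions;

public static class Sandbox
{
    public static AppDomain CreateRestrictedDomain(string entryDir)
    {
        // Grant only the permission to execute code: no file, network,
        // or registry access from inside the domain.
        var permissions = new PermissionSet(PermissionState.None);
        permissions.AddPermission(
            new SecurityPermission(SecurityPermissionFlag.Execution));

        var setup = new AppDomainSetup { ApplicationBase = entryDir };
        return AppDomain.CreateDomain("EntrySandbox", null, setup, permissions);
    }
}
```

The runner would then construct the competitor inside that domain via `CreateInstanceAndUnwrap` (which, as I understand it, requires the type to be marshallable across domains, e.g. deriving from `MarshalByRefObject`). Is something like this considered a real security boundary, or does it only guard against accidents?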