When it comes to storing my data "in the cloud" (i.e. on someone else's server), I always have a bad feeling that something like "Google's deleted an artist's blog, along with 14 years of his work" might happen to my data, too.
On the other hand, even big companies like Microsoft store lots of source code on GitHub.
My question:
What is the usual policy of companies storing their source code on external servers when it comes to minimizing risks of data loss?
For example:
- They could rely on GitHub making enough backups.
- They could have a policy to always store data in local data centers before publishing to GitHub.
- They could have special contracts with GitHub to get additional backups.
- They could fetch their data via the GitHub API (or git itself) and store it locally.
- …
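The last option in that list requires surprisingly little machinery: git itself supports full-fidelity backups via mirror clones, no GitHub-specific API needed. Below is a minimal sketch; the `upstream` repository is a local stand-in for a real GitHub URL such as `https://github.com/org/repo.git` (an assumption for the example), and the demo user name/email are placeholders.

```shell
set -e
tmp=$(mktemp -d) && cd "$tmp"

# Local stand-in for the GitHub-hosted repository
# (in practice this would be an https://github.com/... URL).
git init -q -b main upstream
git -C upstream -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "initial commit"

# One-time setup: a bare --mirror clone copies ALL refs
# (branches, tags, notes), not just the checked-out branch.
git clone -q --mirror upstream backup.git

# Scheduled job (cron, CI, ...): refresh the mirror, pruning refs
# that were deleted upstream.
git -C upstream -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "new work"
git -C backup.git remote update --prune >/dev/null 2>&1

# The mirror now holds both commits.
git -C backup.git rev-list --count --all   # → 2
```

Because every clone contains the full history, even a plain developer checkout is already a partial backup; a scheduled mirror just makes that systematic.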
So I'm really just trying to understand why and how Microsoft (and other companies) can publish code to their public GitHub repositories, and which strategies they apply to protect themselves from data loss.
Maybe my question is opinion-based to some degree; on the other hand, there is a chance that someone from Microsoft (or a similar company) reads this and can actually answer it.
Update 1
I'm not asking how Git (or GitHub) technically allows you to distribute your source code; I hope I understand most of those concepts.
I'm asking how to convince management, from a security point of view, to allow their intellectual property (i.e. source code) to be stored on external servers by an external company.
And since lots of companies, including big ones like Microsoft, actually do use GitHub, I'm interested in how they deal with this.
Update 2
I've added a few words to hopefully make it clearer that I'm mostly concerned about data loss.
I don't know whether this is still security-related, though.